Olmo-3
The Allen Institute for AI's latest open-source language model, released with full pipeline transparency: training code, model weights, and data.
OLMo 2, the architecture behind the Olmo-3 series, is a state-of-the-art open language model framework built by the Allen Institute for AI (AI2). This iteration prioritizes data integrity and reproducibility, releasing the full training code, model weights, and the Dolma pretraining dataset (roughly 3 trillion tokens). Its 7-billion-parameter dense architecture matches or exceeds Llama 3 on benchmarks such as MMLU and GSM8K while remaining entirely open to academic and commercial audit.