Model description
OLMo is a series of Open Language Models designed to enable the science of language models. The OLMo models are trained on the Dolma dataset. The OLMo team is releasing all code, checkpoints, logs (coming soon), and details involved in training these models.
Open source status
- The model implementation is available
- The model weights are available
Provide useful links for the implementation
Authored by Allen Institute for AI (HF org: allenai)
Implementation: https://github.com/allenai/OLMo/tree/main/olmo
HF Hub: allenai/OLMo-1B, allenai/OLMo-7B, allenai/OLMo-7B-Twin-2T
Weights in HF formats (.safetensors and .bin) can be found on the respective HF Hub pages. Each of these repos has branches containing intermediate checkpoints.
Weights in the original OLMo format can be retrieved by following the instructions on the GitHub page.
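Since intermediate checkpoints live on branches of each Hub repo, they can be selected with the `revision` argument of `from_pretrained`. Below is a minimal sketch of how loading might look once the requested `transformers` integration exists; the `trust_remote_code=True` flag and the default `"main"` revision are assumptions (branch names vary per repo and should be checked on the Hub page), not confirmed API for this model.

```python
def hub_repo_id(model_name: str, org: str = "allenai") -> str:
    """Build the Hub repo id for an OLMo variant, e.g. 'allenai/OLMo-1B'."""
    return f"{org}/{model_name}"

# The three variants listed in this issue.
OLMO_VARIANTS = ["OLMo-1B", "OLMo-7B", "OLMo-7B-Twin-2T"]
repo_ids = [hub_repo_id(name) for name in OLMO_VARIANTS]


def load_olmo(repo_id: str, revision: str = "main"):
    """Load an OLMo checkpoint from the HF Hub.

    `revision` selects a branch, which is how the repos expose
    intermediate training checkpoints (branch names are per-repo;
    "main" holds the final weights). trust_remote_code is an
    assumption for pre-integration loading.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model = AutoModelForCausalLM.from_pretrained(
        repo_id, revision=revision, trust_remote_code=True
    )
    tokenizer = AutoTokenizer.from_pretrained(
        repo_id, revision=revision, trust_remote_code=True
    )
    return model, tokenizer
```

For example, `load_olmo("allenai/OLMo-1B")` would fetch the final 1B checkpoint, while passing a branch name as `revision` would fetch the corresponding intermediate checkpoint.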