Add OLMo family #29885

@2015aroras

Description

Model description

OLMo is a series of Open Language Models designed to enable the science of language models. The OLMo models are trained on the Dolma dataset. The OLMo team is releasing all code, checkpoints, logs (coming soon), and details involved in training these models.

Open source status

  • The model implementation is available
  • The model weights are available

Provide useful links for the implementation

Authored by Allen Institute for AI (HF org: allenai)

Implementation: https://github.com/allenai/OLMo/tree/main/olmo
HF Hub: OLMo-1B, OLMo-7B, OLMo-7B-Twin-2T

Weights in HF formats (.safetensors and .bin) can be found on the respective HF Hub pages. Each of these repos has branches containing intermediate checkpoints.
Weights in the original OLMo format can be retrieved by following the instructions on the GitHub page.
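As a minimal sketch of how the branch layout could be used once OLMo support lands in `transformers`: `from_pretrained` accepts a `revision` argument that selects a Hub branch, so an intermediate checkpoint could be loaded by passing its branch name. The repo ids below come from the Hub pages listed above; the branch names themselves are whatever each repo defines, so no specific name is assumed here.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo ids from the HF Hub pages listed above.
HF_REPOS = ["allenai/OLMo-1B", "allenai/OLMo-7B", "allenai/OLMo-7B-Twin-2T"]


def load_olmo(repo_id: str, revision: str = "main"):
    """Load an OLMo checkpoint from the Hub.

    `revision` selects a branch of the repo, so an intermediate
    checkpoint can be pulled by passing its branch name instead of
    the default "main" (which holds the final weights).
    """
    tokenizer = AutoTokenizer.from_pretrained(repo_id, revision=revision)
    model = AutoModelForCausalLM.from_pretrained(repo_id, revision=revision)
    return tokenizer, model
```

Calling `load_olmo("allenai/OLMo-1B")` would fetch the final weights; passing a checkpoint branch via `revision` would fetch that intermediate state instead.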
