
Add OLMo family #29885

Closed · 2 tasks done
2015aroras opened this issue Mar 26, 2024 · 0 comments · Fixed by #29890

Comments

@2015aroras
Contributor

Model description

OLMo is a series of Open Language Models designed to enable the science of language models. The OLMo models are trained on the Dolma dataset. All code, checkpoints, logs (coming soon), and other details involved in training these models are being released.

Open source status

  • The model implementation is available
  • The model weights are available

Provide useful links for the implementation

Authored by Allen Institute for AI (HF org: allenai)

Implementation: https://github.com/allenai/OLMo/tree/main/olmo
HF Hub: allenai/OLMo-1B, allenai/OLMo-7B, allenai/OLMo-7B-Twin-2T

Weights in HF formats (.safetensors and .bin) can be found on the respective HF Hub pages. Each of these repos also has branches containing intermediate checkpoints.
Weights in the original OLMo format can be retrieved by following the instructions on the GitHub page.
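For illustration, a minimal sketch of fetching weights from one of these Hub repos with `huggingface_hub`; the repo id and the `revision` branch name are assumptions for the example and should be checked against the actual branch list on the repo's Hub page:

```python
from huggingface_hub import snapshot_download

# Download the main-branch weights (repo id assumed for illustration).
main_dir = snapshot_download(repo_id="allenai/OLMo-1B")

# Intermediate checkpoints live on separate branches; select one via `revision`.
# The branch name below is hypothetical -- see the repo's Hub page for real names.
ckpt_dir = snapshot_download(repo_id="allenai/OLMo-1B", revision="step1000-tokens4B")

print(main_dir, ckpt_dir)
```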
