Pythia is a suite of 16 LLMs, ranging in size from 70M to 12B parameters, all trained on public data seen in exactly the same order. The suite was developed to facilitate research in many areas, which is why I think it would be a good addition to KerasNLP. As part of Google Summer of Code, I'll work on adding the following:
- GPTNeoXBackbone (#1056)
- GPTNeoXTokenizer (#1085)