
Add LLaMA support #234

Merged
merged 6 commits into TransformerLensOrg:main on Apr 12, 2023

Conversation

@0amp (Contributor) commented Apr 4, 2023

Description

Adds support for all four LLaMA models using Hugging Face as a backend. Requires the main branch of the transformers repo and accelerate (pip install accelerate git+https://github.com/huggingface/transformers).

I added accelerate to setup.py. I'm not sure how best to declare the main branch of transformers as a package dependency.
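For reference, one common way to express a git-branch dependency is a PEP 508 direct reference in setup.py. This is a sketch under assumptions (the package name here is hypothetical, and it is not necessarily what this PR ended up doing):

```python
# Sketch: setup.py with a PEP 508 direct reference that pins transformers
# to its main branch on GitHub. Caveat: PyPI rejects releases whose
# dependencies use direct URLs, so this only works for installs from
# source or from git, not for a normal PyPI release.
from setuptools import setup

setup(
    name="my_package",  # hypothetical name for illustration
    install_requires=[
        "accelerate",
        "transformers @ git+https://github.com/huggingface/transformers",
    ],
)
```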

Fixes #201

Type of change


  • New feature (non-breaking change which adds functionality)

Checklist:

  • I have commented my code, particularly in hard-to-understand areas
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works (see LLaMA.ipynb demo)
  • New and existing unit tests pass locally with my changes
  • I have not rewritten tests relating to key interfaces which would affect backward compatibility

@0amp 0amp marked this pull request as draft April 4, 2023 19:49
@neelnanda-io (Collaborator) commented Apr 5, 2023 via email

@0amp (Contributor, Author) commented Apr 5, 2023

Yup! I'm not sure what's up with the upload to Hugging Face. For some reason the PR was merged into the main transformers branch, and there are at least half a dozen other clones (or fine-tuned versions) on Hugging Face now anyway, so I'd bet it probably won't get taken down. My understanding is that Meta has not said anything new about their license.

Also, is there some easy way to fix the dependencies so that the tests pass?

@neelnanda-io (Collaborator) commented Apr 5, 2023 via email

@0amp (Contributor, Author) commented Apr 5, 2023

So there's huggingface/transformers#21796. It looks like the merged PR just sets up the LLaMA class, and individual people have uploaded the weights themselves. It seems the HF team isn't going to do anything about it, and Meta hasn't said anything about it yet. The original PR (huggingface/transformers#21796) also includes a script for converting the Meta weights, so I could change this PR to have people download and convert the weights themselves.
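For context, the conversion script mentioned above ships inside the transformers repo. An invocation sketch (paths are placeholders, and flag names may have changed since this PR):

```shell
# Sketch: convert an original Meta LLaMA checkpoint to the Hugging Face
# format using the script added in huggingface/transformers#21796.
# All paths below are placeholders; requires the downloaded Meta weights.
python src/transformers/models/llama/convert_llama_weights_to_hf.py \
    --input_dir /path/to/downloaded/llama/weights \
    --model_size 7B \
    --output_dir /path/to/converted/llama-7b-hf
```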

@neelnanda-io (Collaborator) commented Apr 5, 2023 via email

@0amp 0amp marked this pull request as ready for review April 6, 2023 22:00
@jbloomAus (Collaborator) commented
@neelnanda-io It sounds like @0amp has completed the PR per your requests: it is documented and expects users to have legal access to the weights. However, given the nature of the PR, I'll let you merge it, or if you reply here asking me to, I'll merge it.

@neelnanda-io (Collaborator) commented Apr 12, 2023 via email

@jbloomAus jbloomAus merged commit 3d03ca5 into TransformerLensOrg:main Apr 12, 2023