Add Llama/Llama2 support in Question-Answering #745
Conversation
Fixed the issue reported in GS-23: the missing `LlamaForQuestionAnswering` class is added by applying huggingface/transformers#28777. Also fixed the `run_qa.py` script to support the Llama tokenizer.
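For context, a `*ForQuestionAnswering` head in transformers follows the standard extractive-QA pattern: it produces start and end logits over the input tokens, and the predicted answer is the span maximizing their combined score. The sketch below illustrates that span selection in isolation; the tokens and logits are made up for illustration, not real model output.

```python
# Sketch of extractive-QA span selection: pick the (start, end) pair with the
# highest combined logit score, subject to end >= start and a max span length.
# Tokens and logits below are illustrative, not real Llama output.

def best_span(start_logits, end_logits, max_len=15):
    best = (0, 0)
    best_score = float("-inf")
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score = score
                best = (s, e)
    return best

tokens = ["The", "capital", "of", "France", "is", "Paris", "."]
start_logits = [0.1, 0.2, 0.1, 0.3, 0.2, 2.5, 0.1]
end_logits = [0.1, 0.1, 0.2, 0.4, 0.1, 2.8, 0.3]

s, e = best_span(start_logits, end_logits)
answer = " ".join(tokens[s : e + 1])  # -> "Paris"
```

The real `run_qa.py` post-processing additionally handles sub-word offsets, multiple document strides, and n-best lists, but the core argmax-over-spans idea is the same.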
(force-pushed from 6f1f112 to 61a701f)
Verified with the updated transformers v4.38.2 (#788); works fine.
(force-pushed from 61a701f to fe4ea04)
Can you update the main branch of your fork and rebase this PR on it please?
(force-pushed from fe4ea04 to 6ad6b74)
Rebased.
LGTM!
I pushed two additional commits to refine the README a bit and ran `make style`.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Added Llama/Llama2 support in Question-Answering