Add a parameter to pass tokenizer to llama QNN runner #12285
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/12285
Note: Links to docs will display an error until the docs builds have been completed.
✅ You can merge normally! (2 Unrelated Failures)
As of commit 0523319 with merge base 1ae4389.
BROKEN TRUNK - The following jobs failed but were present on the merge base:
👉 Rebase onto the `viable/strict` branch to avoid these failures
This comment was automatically generated by Dr. CI and updates every 15 minutes.

This pull request was exported from Phabricator. Differential Revision: D77910880
"failed to load tokenizer %s", | ||
tokenizer_path_.c_str()); | ||
if (tokenizer_ != nullptr) { | ||
eos_ids->insert(200008); |
What is this 200008? It seems very model/tokenizer specific.
Replied in phabricator.
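For context, here is a minimal sketch of the fallback behavior being discussed. This is not the actual ExecuTorch QNN runner API: the `Runner` class shape, its constructor signature, the `load_tokenizer` helper, and the `collect_eos_ids` method are illustrative assumptions; only `tokenizer_`, `tokenizer_path_`, `eos_ids`, and the hard-coded `200008` id come from the diff above.

```cpp
#include <cstdint>
#include <memory>
#include <string>
#include <unordered_set>

// Stand-in for the real tokenizer interface used by the QNN llama runner.
struct Tokenizer {};

// Assumed helper: in the real runner this would parse the tokenizer file.
std::unique_ptr<Tokenizer> load_tokenizer(const std::string& /*path*/) {
  return std::make_unique<Tokenizer>();
}

class Runner {
 public:
  // New optional parameter: a caller-supplied tokenizer. When none is given,
  // the runner keeps the old behavior and loads from tokenizer_path_.
  explicit Runner(std::string tokenizer_path,
                  std::unique_ptr<Tokenizer> tokenizer = nullptr)
      : tokenizer_path_(std::move(tokenizer_path)),
        tokenizer_(std::move(tokenizer)) {}

  void collect_eos_ids(std::unordered_set<std::uint64_t>* eos_ids) {
    if (tokenizer_ == nullptr) {
      // Fallback path; the real code logs "failed to load tokenizer %s"
      // with tokenizer_path_.c_str() if this fails.
      tokenizer_ = load_tokenizer(tokenizer_path_);
    }
    if (tokenizer_ != nullptr) {
      // Model-specific EOS token id taken from the review snippet above.
      eos_ids->insert(200008);
    }
  }

 private:
  std::string tokenizer_path_;
  std::unique_ptr<Tokenizer> tokenizer_;
};
```

A caller that already owns a tokenizer could then construct the runner as `Runner runner("/path/to/tokenizer.bin", std::move(my_tokenizer));` and skip the file load entirely.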
@pytorchbot label "release notes: none"
Summary: Pull Request resolved: pytorch#12285 Reviewed By: billmguo Differential Revision: D77910880
40e1766 to 747d41b (Compare)
Summary: Pull Request resolved: pytorch#12285 Reviewed By: billmguo Differential Revision: D77910880
747d41b to e997fec (Compare)
Summary: Pull Request resolved: pytorch#12285 Reviewed By: billmguo Differential Revision: D77910880
e997fec to f17d3af (Compare)
Thanks for addressing the comments!
"failed to load tokenizer %s", | ||
tokenizer_path_.c_str()); | ||
} else { | ||
if (tokenizer_ != nullptr) { |
@limintang maybe let's add a TODO or comment here so that others know the context. @haowhsu-quic @shewu-quic this is for supporting an internal model; feel free to let us know your thoughts and we can address it.
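As a hedged illustration of the kind of annotation being requested, within the `collect_eos_ids` sketch shown earlier it might read as follows; the TODO wording is invented, and only the null check and the EOS id come from the diff.

```cpp
if (tokenizer_ != nullptr) {
  // TODO: 200008 is the EOS token id of the internal model's tokenizer that
  // this runner currently targets; document or generalize this once that
  // model's details can be shared.
  eos_ids->insert(200008);
}
```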
Summary: Pull Request resolved: pytorch#12285 Reviewed By: billmguo, cccclai Differential Revision: D77910880
f17d3af to 17bed46 (Compare)
Summary: Pull Request resolved: pytorch#12285 Reviewed By: billmguo, cccclai Differential Revision: D77910880
17bed46 to 0523319 (Compare)
Differential Revision: D77910880 Pull Request resolved: pytorch#12285