examples : evaluate tokens in batches after swapping context #1014


Merged (2 commits) Apr 21, 2023

Conversation

@grencez (Contributor) commented Apr 16, 2023

This new loop around llama_eval is somewhat redundant with the batching done in the main loop, but without a refactor it is still necessary so that print statements happen at the right times.

@grencez force-pushed the batching branch 5 times, most recently from 26748b2 to 3bc0a89 on April 16, 2023 10:22
@grencez changed the title Evaluate tokens in batches after swapping context → examples : evaluate tokens in batches after swapping context Apr 16, 2023
@grencez marked this pull request as ready for review April 16, 2023 10:30

@grencez (Contributor, Author) commented Apr 17, 2023

Tests passed yesterday. I just synced recent changes and added a comment.
