fix(llama-index): extract token counts for groq when streaming #1174

Merged
merged 3 commits from fix-groq-token-count into main on Dec 17, 2024

Conversation

RogerHYang
Contributor

@RogerHYang RogerHYang commented Dec 16, 2024
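The PR description itself is empty, but the title and the linked bug indicate the nature of the fix: when a Groq response is streamed, token counts were not being picked up by the llama-index instrumentation. As a rough illustration only (not the PR's actual code), Groq's OpenAI-compatible streaming API typically omits per-chunk usage and attaches token counts to the final chunk under a vendor-specific key, so an instrumentation layer has to look beyond the standard `usage` attribute. The helper below is a hypothetical sketch of that idea; the function name and dict shapes are assumptions for illustration.

```python
from typing import Optional


def extract_token_counts(chunk: dict) -> Optional[dict]:
    """Return token counts from a (dict-shaped) streaming chunk, if present.

    Hypothetical sketch: checks the standard OpenAI-style ``usage`` field
    first, then falls back to the ``x_groq.usage`` location that Groq
    commonly uses on the final chunk of a stream.
    """
    # Standard OpenAI-style location (usually absent on Groq stream chunks).
    usage = chunk.get("usage")
    if usage is None:
        # Groq nests usage under a vendor-specific "x_groq" key.
        usage = (chunk.get("x_groq") or {}).get("usage")
    if usage is None:
        return None
    return {
        "prompt_tokens": usage.get("prompt_tokens"),
        "completion_tokens": usage.get("completion_tokens"),
        "total_tokens": usage.get("total_tokens"),
    }
```

In an instrumentation context, a callback would apply a check like this to each chunk as the stream is consumed and record the counts once they appear, rather than assuming they arrive on every chunk.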

@RogerHYang RogerHYang requested a review from a team as a code owner December 16, 2024 17:05
@dosubot dosubot bot added the size:S This PR changes 10-29 lines, ignoring generated files. label Dec 16, 2024
@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:S This PR changes 10-29 lines, ignoring generated files. labels Dec 16, 2024
@RogerHYang RogerHYang changed the title fix: extract token counts for groq when streaming fix(llama-index): extract token counts for groq when streaming Dec 17, 2024
@RogerHYang RogerHYang merged commit 0aafe9c into main Dec 17, 2024
4 checks passed
@RogerHYang RogerHYang deleted the fix-groq-token-count branch December 17, 2024 21:13
Labels
size:L This PR changes 100-499 lines, ignoring generated files.
Projects
Status: Done
Development

Successfully merging this pull request may close these issues.

[BUG] Used token not calculating when streaming - Llamaindex
2 participants