
Fix #1119, no content when streaming. #1122

Merged (1 commit) Dec 14, 2023
Conversation

@emsi (Contributor) commented Dec 13, 2023

Workaround for #1119: ensure we make a non-streaming request.

P.S. When running under a debugger, the original code (without this PR) works. It looks like the added delay is enough for the response to arrive in full, which points to a deeper problem with how the code collects the response. To put it bluntly, the whole ollama code looks quite messy. I have even observed the current streaming code return only a couple of tokens when the same request is repeated in quick succession.
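The truncation described above is consistent with a consumer that stops reading before the stream is drained. The sketch below is illustrative only (the function names are hypothetical, not litellm's actual internals); it contrasts draining a chunk generator fully with the buggy pattern of returning after the first chunks:

```python
# Hedged sketch of the failure mode described in this comment.
# fake_ollama_stream stands in for the token chunks an Ollama
# streaming endpoint would yield; it is not a real litellm API.

def fake_ollama_stream():
    for token in ["Hello", ", ", "world", "!"]:
        yield token

def collect_full_response(chunks):
    # Drain the entire iterator before assembling the response.
    # Returning early (e.g. after the first chunk arrives) is what
    # produces the "couple of tokens" truncation reported here.
    return "".join(chunks)

def buggy_first_chunk_only(chunks):
    # The anti-pattern: treat the first yielded chunk as the answer.
    return next(chunks)

print(collect_full_response(fake_ollama_stream()))   # full text
print(buggy_first_chunk_only(fake_ollama_stream()))  # truncated
```

Forcing a non-streaming request, as this PR does, sidesteps the problem entirely because the server only responds once the full completion is ready.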


@krrishdholakia (Contributor) commented Dec 13, 2023

Hey @emsi, I forgot to update the ticket; this is already fixed in 7b8851c.
