Ollama: Stream Always Fails #667
The issue doesn't seem to be with the stream, but rather that the model is not outputting the correct format. A couple of things to note:

To annotate the result type you can do the following:

```python
from pydantic import BaseModel, Field

class DOB(BaseModel):
    year: int
    month: int = Field(ge=1, le=12)
    day: int = Field(ge=1, le=31)  # still allows some invalid dates (e.g. Feb 31), but those can be validated later

class UserProfile(BaseModel):
    name: str = Field(description="user's name")
    dob: DOB = Field(description="user's date of birth, output json format {year: number, month: number, day: number}")
    bio: str = Field(description="Miscellaneous user bio information")
```

Hope this helps!
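As a quick sanity check of the `Field` constraints described above (a minimal sketch, assuming pydantic v2), out-of-range values are rejected at construction time:

```python
# Demonstrates that the ge/le constraints on DOB reject invalid months.
from pydantic import BaseModel, Field, ValidationError

class DOB(BaseModel):
    year: int
    month: int = Field(ge=1, le=12)
    day: int = Field(ge=1, le=31)

dob = DOB(year=1990, month=5, day=17)  # valid, passes both range checks
print(dob)

try:
    DOB(year=1990, month=13, day=1)    # month out of range, raises ValidationError
except ValidationError as exc:
    print("rejected:", exc.error_count(), "error(s)")
```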
Ahh, I was looking at the documentation and I think I realize what's going on. The reason the example you shared would work is that you can construct a `datetime.date` object from an ISO-formatted string, which is likely what Pydantic is doing under the hood. If you don't mention that the date should be in ISO format, the model is likely trying to output a locale-specific date, so the parsing doesn't work. The best solution is probably to mention in the annotation of your `dob` field that the model should output "ISO formatted dates":

```python
from datetime import date
from pydantic import BaseModel, Field

class UserProfile(BaseModel):
    name: str = Field(description="user's name")
    dob: date = Field(description="user's date of birth, ISO format")
    bio: str = Field(description="Miscellaneous user bio information")
```
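The point about ISO strings can be checked directly (a minimal sketch, assuming pydantic v2; the `Profile` model here is a simplified stand-in):

```python
# Pydantic parses an ISO 8601 string into a date, but a locale-style
# string like "05/17/1990" fails validation, which matches the
# behavior described above for the model's dob output.
from datetime import date
from pydantic import BaseModel, ValidationError

class Profile(BaseModel):
    dob: date

print(Profile(dob="1990-05-17").dob)  # ISO string parses cleanly

try:
    Profile(dob="05/17/1990")         # locale-specific format, raises ValidationError
except ValidationError:
    print("locale-style date rejected")
```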
Thanks, @SiddarthNarayanan01, for getting back to me. That was just an example; here's another one with a similar issue.
I've tried everything, but streaming with Ollama doesn't seem to work. The `run` method, however, works well.
The error comes from that check: `self._allow_text_result` returns `False`.
Only streaming a structured response doesn't work.
Ah, I misunderstood your question, sorry about that! Interestingly enough, I too have issues getting streaming with Ollama to work, but instead of the errors you are facing I get the entire response in one go (no chunking). Also, the moment the model calls a tool I get an empty response (still no errors). Not sure why this is happening. I'll look at it further.
Can you reproduce my bug?
Yep, I can reproduce your bug when running with structured output. Not sure if this is the source of the bug, but when a model calls a tool, the `TextPart` is empty. In streaming mode it seems to just stop there and return that empty string back. That's probably why in your case you get the validation error: the model is returning an empty string when in reality you require a JSON structure.
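The empty-string failure mode described above is easy to reproduce in isolation (a minimal sketch, assuming pydantic v2; the `UserProfile` model is simplified from the earlier examples):

```python
# Validating an empty model response against a structured result type
# fails, which is consistent with the validation error reported above.
from pydantic import BaseModel, ValidationError

class UserProfile(BaseModel):
    name: str
    bio: str

try:
    UserProfile.model_validate_json("")  # empty response instead of JSON
except ValidationError as exc:
    print("validation failed:", exc.errors()[0]["type"])
```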
I’ve decided to stop using Ollama and have returned to LMStudio. I’ve also submitted a pull request: #705.
Hi,
Streaming a structured text response with Ollama always fails, with any model that supports tools.
Code:
Error: