Activity
mimi3421 commented on Apr 14, 2025
I think there is a bug somewhere in the request POST JSON message relating to the `null` string. I tested mcphost locally against a local llama.cpp server with an empty mcpServer setting and sent a simple prompt, "hi". The log in mcphost was:

While the log in llama.cpp was:

I think the `null` in the request JSON should be `""`, which triggered the error in the JSON parser.

Edited: I'm new to this, and I think the empty `reasoning_content` field should be omitted from the request POST rather than just set to `null`. Similar issues may be shown in link or link.
Waasi commented on Apr 14, 2025
Pitching in just in case this is still an issue. For me, it turned out that the model I was using was not found. I got to this by logging the error returned here. I hope this helps.
Add `omitempty` to `reasoning_content` in OpenAI message API #27
semidark commented on Apr 19, 2025
I ran into the same issue, and at least in my case, @Waasi was correct. For me, not only was the model not found, but I also could not connect to the Ollama server. I had to correctly set my OLLAMA_HOST environment variable. After that, I got my first MCP server up and running.
So, I believe better logging when this error occurs would be really helpful.