fix(ollama): fixed messages array for ollama, added gpt-5.2 #2315
Conversation
Greptile Overview

Greptile Summary

Fixed a bug in the Ollama provider integration where messages were being duplicated, and added support for the new gpt-5.2 model.

Key Changes:
Confidence Score: 5/5
Important Files Changed

File Analysis
Sequence Diagram

sequenceDiagram
participant Agent as AgentBlockHandler
participant Builder as buildProviderRequest
participant Provider as executeProviderRequest
participant Ollama as OllamaProvider
Agent->>Builder: buildProviderRequest(config)
Builder->>Builder: validateMessages(messages)
alt validMessages = true
Note over Builder: Messages are valid<br/>Set context = undefined<br/>Set systemPrompt = undefined
Builder->>Provider: { messages, context: undefined }
Provider->>Ollama: { messages: [...] }
Ollama->>Ollama: allMessages.push(...messages)
Note over Ollama: No duplicate messages
else validMessages = false
Note over Builder: Messages invalid<br/>Set context = stringifyJSON(messages)<br/>Use systemPrompt
Builder->>Provider: { context, systemPrompt }
Provider->>Ollama: { context: "..." }
Ollama->>Ollama: allMessages.push({ role: 'user', content: context })
Note over Ollama: Context added as user message
end
Ollama-->>Provider: Response
Provider-->>Agent: Result
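The branching in the diagram can be summarized in a minimal TypeScript sketch. The names here (validateMessages, buildProviderRequest, the request fields) are simplified assumptions for illustration and not the actual code in apps/sim/providers:

```typescript
interface Message {
  role: 'system' | 'user' | 'assistant'
  content: string
}

interface ProviderRequest {
  messages?: Message[]
  context?: string
  systemPrompt?: string
}

// Assumed validity check: every entry has a string role and string content.
function validateMessages(messages: unknown): messages is Message[] {
  return (
    Array.isArray(messages) &&
    messages.every(
      (m) =>
        typeof m === 'object' &&
        m !== null &&
        typeof (m as Message).role === 'string' &&
        typeof (m as Message).content === 'string'
    )
  )
}

function buildProviderRequest(messages: unknown, systemPrompt: string): ProviderRequest {
  if (validateMessages(messages)) {
    // Valid messages: pass them through directly and clear context/systemPrompt
    // so the provider does not append the same content a second time.
    return { messages, context: undefined, systemPrompt: undefined }
  }
  // Invalid messages: fall back to stringifying them into context, which the
  // provider appends as a single user message.
  return { context: JSON.stringify(messages), systemPrompt }
}
```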
Additional Comments (1)
- apps/sim/providers/ollama/index.ts, lines 110-124 (link)
  logic: removed the `request.context` handling that exists in all other providers - this breaks context passing.
  All other providers (OpenAI, Anthropic, Google, etc.) handle `request.context` by inserting it as a user message between the system prompt and the messages array. Removing this from Ollama breaks parity and will cause context to be ignored when using Ollama.
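For reference, a minimal sketch of the pattern the comment describes, assuming a provider that assembles its message array from systemPrompt, context, and messages; the names are illustrative, not the actual provider code:

```typescript
interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

interface ProviderRequest {
  systemPrompt?: string
  context?: string
  messages?: ChatMessage[]
}

function buildMessages(request: ProviderRequest): ChatMessage[] {
  const allMessages: ChatMessage[] = []

  // System prompt goes first, if present.
  if (request.systemPrompt) {
    allMessages.push({ role: 'system', content: request.systemPrompt })
  }

  // Context is inserted as a user message between the system prompt and the
  // conversation history; dropping this branch means context is silently ignored.
  if (request.context) {
    allMessages.push({ role: 'user', content: request.context })
  }

  // Finally, the conversation messages themselves.
  if (request.messages) {
    allMessages.push(...request.messages)
  }

  return allMessages
}
```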
2 files reviewed, 1 comment
2 files reviewed, no comments
Summary
Type of Change
Testing
Tested manually
Checklist