Ollama models are not respecting the system prompt / context #18

Open · lfnovo opened this issue Nov 13, 2024 · 0 comments

lfnovo (Owner) commented Nov 13, 2024

When using Ollama as the chat model, for some reason it does not recognize the context passed in the system prompt, which makes it effectively useless for chat.

I plan to investigate this further soon, but if you are familiar with using Ollama with LangChain, I'd appreciate your help. For the time being, I recommend using other providers for chat.

Ollama models work great for transformations and can be used with no issues there.
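For anyone who wants to reproduce this, the symptom looks roughly like the minimal sketch below: the notebook context is passed as a SystemMessage ahead of the user message, but the Ollama model answers as if the system prompt were not there. This is an illustrative snippet, not the project's actual chat code; the model name "llama3", the prompt contents, and the import path (ChatOllama may come from langchain_ollama or langchain_community depending on your LangChain version) are assumptions.

```python
# Minimal repro sketch (not the project's actual chat code).
# Assumes `ollama serve` is running locally and the model has been pulled;
# "llama3" is a placeholder model name.
from langchain_core.messages import HumanMessage, SystemMessage
from langchain_ollama import ChatOllama  # or: from langchain_community.chat_models import ChatOllama

chat = ChatOllama(model="llama3", temperature=0)

messages = [
    # Context the model is expected to ground its answer in.
    SystemMessage(content="You are an assistant for my notebook. Context: the meeting is on Friday at 3pm."),
    HumanMessage(content="When is the meeting?"),
]

response = chat.invoke(messages)
# Expected: an answer that uses the context above.
# Observed with Ollama: the system prompt / context appears to be ignored.
print(response.content)
```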
