[Bug] ollama model provider cannot set parameters such as 'num_ctx' #2442
Comments
itaa
added a commit
to jacksoja/MaxKB
that referenced
this issue
Feb 28, 2025
langchain-openai is not compatible with Ollama parameter settings such as num_ctx; therefore, model instances need to be created with langchain-ollama
Closed
itaa
added a commit
to jacksoja/MaxKB
that referenced
this issue
Feb 28, 2025
…l-dev#2442 langchain-openai is not compatible with Ollama parameter settings such as num_ctx; therefore, model instances need to be created with langchain-ollama
itaa
added a commit
to jacksoja/MaxKB
that referenced
this issue
Feb 28, 2025
…l-dev#2442 langchain-openai is not compatible with Ollama parameter settings such as num_ctx; therefore, model instances need to be created with langchain-ollama (cherry picked from commit 42ae7b4)
shaohuzhang1
pushed a commit
that referenced
this issue
Feb 28, 2025
Thanks for the feedback; this parameter setting will be supported in the next version.
v1.10.2-lts has been released.
Contact Information
No response
MaxKB Version
v1.10.1-lts (build at 2025-02-13T18:52, commit: 0640d4c)
Problem Description
The ollama model provider cannot set parameters such as 'num_ctx'.
Steps to Reproduce
Set the num_ctx parameter; an error is returned:

The expected correct result
Content is returned normally.
Related log output
Additional Information
No response