Why is top_p assigned to top_k on line 330 of rag/llm/chat_model.py?
Ollama seems to accept only top_k.
It can accept the top_p parameter; I've tested it. Reference: https://github.com/ollama/ollama/blob/ce1fb4447efc9958dcf279f7eb2ae6941bec1220/examples/python-json-datagenerator/readme.md?plain=1#L45
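For what it's worth, a minimal sketch of that check (assuming the ollama Python package is installed and a local model is available; the model name "llama3" and the prompt are placeholders, not from the issue):

```python
import ollama

# Send a chat request with top_p passed through the options dict,
# alongside top_k and temperature, as in the linked Ollama example.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello."}],
    options={"top_p": 0.9, "top_k": 40, "temperature": 0.7},
)
print(response["message"]["content"])
```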
Pass top_p to ollama. Close infiniflow#1769 (commits b9763a4, d94386e)
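Judging from the commit message, the fix is presumably a one-line change to the parameter mapping in the Ollama chat model. A hedged reconstruction of the relevant lines in rag/llm/chat_model.py (the gen_conf/options names are assumptions based on the discussion; the actual code may differ):

```python
# Build the options dict forwarded to the Ollama client.
options = {}
if "temperature" in gen_conf:
    options["temperature"] = gen_conf["temperature"]
if "top_p" in gen_conf:
    # The reported bug: the top_p value was stored under the top_k key,
    # i.e. options["top_k"] = gen_conf["top_p"]. The fix forwards it
    # under its own key, since Ollama accepts top_p directly.
    options["top_p"] = gen_conf["top_p"]
```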
Is there an existing issue for the same bug?
Branch name
latest
Commit ID
c3b232
Other environment information
No response
Actual behavior
Why is top_p assigned to top_k on line 330 of rag/llm/chat_model.py?
Expected behavior
No response
Steps to reproduce
Additional information
No response