
[Bug]: top_p is assigned to top_k #1769

Closed
adzhua opened this issue Aug 1, 2024 · 2 comments · Fixed by #3744
Labels
bug Something isn't working

Comments


adzhua commented Aug 1, 2024

Is there an existing issue for the same bug?

  • I have checked the existing issues.

Branch name

latest

Commit ID

c3b232

Other environment information

No response

Actual behavior

Why is top_p assigned to top_k on line 330 of rag/llm/chat_model.py?

Expected behavior

No response

Steps to reproduce

Why is top_p assigned to top_k on line 330 of rag/llm/chat_model.py?

Additional information

No response
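
To illustrate the report, here is a minimal hypothetical sketch (the function names and structure are assumptions for illustration, not the actual code from rag/llm/chat_model.py) of how a `top_p` value can end up under the `top_k` key when building an options dict, and what the corrected mapping would look like:

```python
def build_options_buggy(gen_conf: dict) -> dict:
    """Sketch of the suspected bug: top_p lands under the top_k key."""
    options = {}
    if "temperature" in gen_conf:
        options["temperature"] = gen_conf["temperature"]
    if "top_p" in gen_conf:
        options["top_k"] = gen_conf["top_p"]  # bug: wrong key
    return options


def build_options_fixed(gen_conf: dict) -> dict:
    """Corrected mapping: top_p is passed through under its own key."""
    options = {}
    if "temperature" in gen_conf:
        options["temperature"] = gen_conf["temperature"]
    if "top_p" in gen_conf:
        options["top_p"] = gen_conf["top_p"]  # fix: keep the top_p key
    return options
```

The difference matters because `top_p` (nucleus sampling, a probability in [0, 1]) and `top_k` (an integer vocabulary cutoff) are not interchangeable parameters.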

@adzhua adzhua added the bug Something isn't working label Aug 1, 2024
KevinHuSh (Collaborator) commented

Ollama seems to only accept top_k.


adzhua commented Aug 2, 2024

> Ollama seems to only accept top_k.

It can accept the top_p parameter; I’ve tested it.
Reference: https://github.com/ollama/ollama/blob/ce1fb4447efc9958dcf279f7eb2ae6941bec1220/examples/python-json-datagenerator/readme.md?plain=1#L45
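
This matches Ollama's documented request format: both `top_p` and `top_k` are accepted inside the `options` object of a generate/chat request, so `top_p` can be passed through unchanged. A minimal sketch of such a payload (the model name and parameter values here are illustrative, not from the issue):

```python
import json

# Ollama accepts sampling parameters inside the "options" object of a
# request body; top_p and top_k are distinct, independent keys there.
payload = {
    "model": "llama3",        # assumed model name for illustration
    "prompt": "Hello",
    "options": {
        "top_p": 0.9,         # nucleus sampling threshold (float in [0, 1])
        "top_k": 40,          # vocabulary cutoff (integer)
    },
}

print(json.dumps(payload, indent=2))
```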

yuzhichang added a commit to yuzhichang/ragflow that referenced this issue Nov 29, 2024
@yuzhichang yuzhichang mentioned this issue Nov 29, 2024