You can set up a custom translation service: https://hcfy.app/docs/services/custom-api
If a custom translation service doesn't meet your needs, feel free to leave further comments.
Originally posted by @lmk123 in #1992 (comment)
I tried the custom translation service, and it really can't meet my needs.
As for why I want this:
Although llama's translation quality doesn't match openai's, a locally deployed model translates even faster than Baidu Translate, so I still hope it can be offered as an option.
The curl format is as follows (I shortened the prompt to keep this from getting too long):
curl http://192.168.0.115:11434/api/generate -d '{ "model": "llama3.2:3b", "prompt":"Hello" }'
响应格式如下:
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470425744Z","response":"How","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470429501Z","response":" can","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470468444Z","response":" I","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470469957Z","response":" assist","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470477Z","response":" you","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470478343Z","response":" today","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470484434Z","response":"?","done":false}
{"model":"llama3.2:3b","created_at":"2024-10-03T16:15:26.470498901Z","response":"","done":true,"done_reason":"stop","context":[128006,9125,128007,271,38766,1303,33025,2696,25,6790,220,2366,18,271,128009,128006,882,128007,271,9906,128009,128006,78191,128007,271,4438,649,358,7945,499,3432,30],"total_duration":90880784,"load_duration":8884284,"prompt_eval_count":26,"prompt_eval_duration":6156000,"eval_count":8,"eval_duration":30376000}
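For reference, a minimal sketch of how these streamed chunks could be stitched back into the full reply. It assumes each line is one JSON object with a `response` fragment and a `done` flag, as in the output above; `collect_response` is a hypothetical helper name, not part of any existing API:

```python
import json

def collect_response(ndjson_lines):
    """Concatenate the "response" fields of Ollama's streaming
    /api/generate output into the full reply text."""
    parts = []
    for line in ndjson_lines:
        line = line.strip()
        if not line:
            continue
        obj = json.loads(line)
        parts.append(obj.get("response", ""))
        if obj.get("done"):  # final chunk carries stats, empty "response"
            break
    return "".join(parts)

# Abridged chunks from the sample response above
chunks = [
    '{"model":"llama3.2:3b","response":"How","done":false}',
    '{"model":"llama3.2:3b","response":" can","done":false}',
    '{"model":"llama3.2:3b","response":" I","done":false}',
    '{"model":"llama3.2:3b","response":" assist","done":false}',
    '{"model":"llama3.2:3b","response":" you","done":false}',
    '{"model":"llama3.2:3b","response":" today","done":false}',
    '{"model":"llama3.2:3b","response":"?","done":false}',
    '{"model":"llama3.2:3b","response":"","done":true,"done_reason":"stop"}',
]
print(collect_response(chunks))  # How can I assist you today?
```

In a real client the lines would come from the HTTP response body as they arrive, so the text can also be rendered incrementally instead of joined at the end.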
I've added a relay layer myself, so this is no longer needed.
ollama will be supported; you can follow further progress in #2111