Use conversation template for api proxy, fix eventsource format #2383
base: master
Conversation
Fix eventsource format
Thank you! The PhpStorm plugin I was using, CodeGPT, didn't work with the main-branch api_like_OAI.py. With yours it works smoothly! With the new Llama-2-based wizard-13b I finally have a usable local-only assistant that integrates seamlessly into my existing workflows. :D
The PR has been merged upstream, and due to a limitation on GitHub's side (https://github.com/orgs/community/discussions/5634), it seems that I cannot allow editing by maintainers.
Thank you! The result generated by llama-cpp-python is missing some keywords.
I am observing this error with a 70B Llama 2 model when attempting to run the guidance tutorial notebook and dropping in
This PR adds a `--chat-prompt-model` parameter that enables the use of a conversation template registered in fastchat/conversation.py. As model prompt templates, like Llama 2's, become more intricate, handling them exclusively with flags such as `--chat-prompt` and `--user-name` becomes less manageable, so a community-maintained conversation template is a more user-friendly solution. Customized system messages are currently pending the merge of lm-sys/FastChat#2069, but the current fschat version should still operate without exceptions.
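For reference, a minimal sketch of how a proxy might resolve one of these community-maintained templates via FastChat's public API; the template name `"llama-2"` and the message contents here are illustrative, not the exact values the proxy uses:

```python
from fastchat.conversation import get_conv_template

# Look up a template registered in fastchat/conversation.py by name,
# e.g. the value passed via --chat-prompt-model (name is illustrative).
conv = get_conv_template("llama-2")

# Append the chat history; roles[0] is the user, roles[1] the assistant.
conv.append_message(conv.roles[0], "Hello, who are you?")
conv.append_message(conv.roles[1], None)  # None marks the pending reply

# Render the full prompt, including the template's system message and
# role separators, ready to send to the completion endpoint.
prompt = conv.get_prompt()
print(prompt)
```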
Furthermore, there is an issue with how data is presented in the event-source format: each message must end with two `\n` characters rather than just one `\n`, i.e. the `data:` line must be followed by a blank line consisting of a single `\n`, which is what OpenAI's API does.
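As an illustration of that framing (the payload shape below is abbreviated, not the exact fields the proxy emits), a server-sent-events chunk can be built like this:

```python
import json

def sse_chunk(payload: dict) -> str:
    # Server-sent events require each message to be terminated by a
    # blank line, i.e. "\n\n"; with only one "\n", clients such as the
    # OpenAI SDK keep buffering and never dispatch the event.
    return f"data: {json.dumps(payload)}\n\n"

# Example: a streamed chat-completion delta (fields abbreviated).
print(repr(sse_chunk({"choices": [{"delta": {"content": "Hello"}}]})))
# -> 'data: {"choices": [{"delta": {"content": "Hello"}}]}\n\n'
```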