[extensions/openai] Failure to load instruction-following template for model #3308
Comments
Confirmed when loading the model via the UI. Thanks for the report; it seems the shared settings are no longer being updated. As a workaround, it works if the model is loaded via the API or via the startup --model command.
I encountered the same problem: the openai extension does not refresh shared.settings.
Thanks for the response. Can you give an example (preferably in Python) of how the model should be loaded through the API? Using "openai.ChatCompletion.create" with a model name doesn't seem to work. I tried the "/v1/engines/{model_name}" endpoint but got an error when sending a request there.
There is an example in the api-examples/ folder, which includes a few robust options for loading models. This is essentially what I use myself. As you found, there is a bug in the (simple) model loader hacked into the openai extension (which is also usable from the command line, like: openai api models.get -i TheBloke_WizardLM-7B-uncensored-GPTQ). That bug (#3305) is fixed in this PR: #3309
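Something along these lines should work (a minimal sketch modelled on the scripts in api-examples/; it assumes the blocking API extension is running on the default port 5000 and exposes a /api/v1/model endpoint accepting a "load" action — adjust the host, port, and model name to your setup):

```python
# Sketch of loading a model through the blocking API extension.
# Assumes the webui was started with the api extension (default port 5000)
# and that /api/v1/model accepts a "load" action, as in api-examples/.
import requests

HOST = "http://127.0.0.1:5000"

def load_model(model_name: str) -> dict:
    """POST a load request to the model endpoint and return the JSON reply."""
    payload = {"action": "load", "model_name": model_name}
    response = requests.post(f"{HOST}/api/v1/model", json=payload)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Placeholder model name; use a folder name from your models/ directory.
    print(load_model("TheBloke_WizardLM-7B-uncensored-GPTQ"))
```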
Thanks, loading using the (non-extension) API worked. Great extension, by the way; thank you for working on it! It makes it very easy to port an app that uses the OpenAI API to a local model. I just hope the issues with the UI are fixed soon, since a very common use case is to run the web UI to play with various options on the fly while sending requests through an app.
This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.
Describe the bug
I always get the following error for every model, despite the model having an instruction template correctly selected in the UI:
I get this error even if the model has a correct template in "characters/instruction-following" that is loaded automatically and matches a pattern in "models/config.yaml".
Looking at the code, it seems the following line in "completions.py" doesn't work as expected, with "shared.settings['instruction_template']" always being "None":
instruct = yaml.safe_load(open(f"characters/instruction-following/{shared.settings['instruction_template']}.yaml", 'r'))
If it is replaced with (for example):
instruct = yaml.safe_load(open(f"characters/instruction-following/Alpaca.yaml", 'r'))
Then the template is loaded correctly.
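A guarded load along these lines would at least avoid the crash (just a sketch, not a proper fix; it assumes shared.settings behaves like a plain dict and uses "Alpaca" as an arbitrary fallback):

```python
# Sketch of a guarded template load with a fallback (not the upstream fix).
import yaml

from modules import shared  # the same shared module the extension already uses

# Fall back to a default template when 'instruction_template' is unset/None.
template_name = shared.settings.get('instruction_template') or 'Alpaca'
with open(f"characters/instruction-following/{template_name}.yaml", 'r') as f:
    instruct = yaml.safe_load(f)
```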
Is there an existing issue for this?
Reproduction
This error happens for every model once the API request is sent, for example with the model "TheBloke/Llama-2-13B-chat-GGML".
The model matches a pattern that exists in "config.yaml".
"Llama-v2" can also be selected as the instruction template in the UI (and is in fact automatically selected when loading the model).
However, when sending an API request, the error appears.
Screenshot
No response
Logs
System Info