[RFE] Additional messages on prompt #74
Based on private feedback, this is a raw example using few-shot prompting:
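A minimal sketch of what such a few-shot prompt can look like; the role/content message layout and the workflow fields below are illustrative, not the service's actual schema:

```json
[
  { "role": "system", "content": "You translate user requests into workflow JSON. Respond with JSON only." },
  { "role": "user", "content": "Restart the nginx service on the web hosts" },
  { "role": "assistant", "content": "{\"steps\": [{\"action\": \"restart_service\", \"target\": \"nginx\", \"hosts\": \"web\"}]}" },
  { "role": "user", "content": "Copy /etc/app.conf to the backup directory" },
  { "role": "assistant", "content": "{\"steps\": [{\"action\": \"copy_file\", \"source\": \"/etc/app.conf\", \"destination\": \"/backup/\"}]}" }
]
```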
When using this, the LLM "learns" how to respond correctly and returns clearly structured data, for example:
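With those examples in the prompt, a new question such as "Add a user named deploy to the web hosts" tends to come back in the same structured form (the reply below is purely illustrative):

```json
{ "steps": [ { "action": "create_user", "name": "deploy", "target": "web" } ] }
```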
When only the system prompt can be used, the output differs a bit:
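Without the few-shot messages, answering from the system prompt alone usually yields looser prose around the JSON (again purely illustrative):

```
Sure! To add that user you could create a workflow step along these lines:
{"action": "create_user", "name": "deploy"}
Let me know if you also need it scoped to specific hosts.
```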
**Why we need this**

When the LLM needs to generate something more than a summary of or guidance on the current docs, the LLM app needs more context and a large context window. For our use case, we want to build a very complex JSON workflow, so we need to give a few examples of how to translate from user input to JSON.

Examples that we give:

A current real demo using a custom backend that uses this technique: https://drive.google.com/file/d/1hEQma6zFCgm0YIFHESjNjaRjj2KqFgkf/view?usp=drive_link

**Does this break the current status?**

Big no! The original request is like this:
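(A sketch only; I assume the current body is essentially a single `query` string, see the linked openapi.json below for the exact fields.)

```json
{
  "query": "Restart the nginx service on the web hosts"
}
```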
And with my proposed changes, it will be something like this:
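A rough sketch of the idea; the `messages` field name and its shape are illustrative, not a finalized schema:

```json
{
  "query": "Restart the nginx service on the web hosts",
  "messages": [
    { "role": "user", "content": "Copy /etc/app.conf to the backup directory" },
    { "role": "assistant", "content": "{\"steps\": [{\"action\": \"copy_file\", \"source\": \"/etc/app.conf\", \"destination\": \"/backup/\"}]}" }
  ]
}
```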
So current users will not have to change their code, and this change will allow new users who need more "strict" output to build their AI apps on top of this backend.
Type: feature-request
Description:
In complex scenarios, we need to teach the LLM how to generate the
response. This technique is few-shot prompting, where a few example messages are
placed in the prompt before the human question.
Ollama implemented this using a messages array that the end user defines:
https://github.com/ollama/ollama/blob/dc6fe820512d1046f3a342e384baa64b8ce1758c/docs/api.md?plain=1#L451-L457
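For reference, Ollama's /api/chat endpoint takes the whole conversation as a messages array, roughly like this (abridged; the model name is just a placeholder, see the linked docs for the exact example):

```json
{
  "model": "llama3",
  "messages": [
    { "role": "system", "content": "Respond with JSON only." },
    { "role": "user", "content": "Restart the nginx service" },
    { "role": "assistant", "content": "{\"steps\": [{\"action\": \"restart_service\", \"target\": \"nginx\"}]}" },
    { "role": "user", "content": "Copy /etc/app.conf to the backup directory" }
  ]
}
```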
I think that in this case, it'll be cool to use something similar, which could be appended here:
service/docs/openapi.json, lines 509 to 595 (commit 5360e36)
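A rough sketch of how an optional property could be appended under that request schema's `properties` (the property name and shape are assumptions for discussion, not a finalized design):

```json
{
  "messages": {
    "type": "array",
    "description": "Optional few-shot example messages placed before the user question.",
    "items": {
      "type": "object",
      "properties": {
        "role": { "type": "string", "enum": ["system", "user", "assistant"] },
        "content": { "type": "string" }
      },
      "required": ["role", "content"]
    }
  }
}
```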
Another example can be found in the PDL (Prompt Declaration Language) project:
https://github.com/IBM/prompt-declaration-language/blob/572373a09e2d105cf6712859d4be5fb371ba1051/examples/tutorial/calling_llm_with_input_messages.pdl#L5-L10
This, as far as I know:
Steps needed
Questions: