Function Calling #29
Conversation
Thank you for the good first draft of the function calling API integration @AdritRao! It has all the necessary pieces in place; I just had a few comments on the general structure and some OpenAI-specific features that can be improved to fully reflect the API spec and to ensure good reusability of the code across the two use cases: where we have function calls and where we won't use them.
Let me know if you have any questions regarding the specific comments; happy to help and support in any way I can to get this PR merged 🚀
Hi @PSchmiedmayer, thank you for your comments! I have addressed them and also added the support for function calling using the updated Spezi ML module (StanfordSpezi/SpeziLLM#22). The main change after the comments was the addition of the custom …
Hi @PSchmiedmayer, thank you for your feedback! I have addressed the comments in the latest commit. I modified the code so that the …
Thank you for the improvements @AdritRao
I took a look at the PR and made some smaller refinements and code improvements that should not hold us up from merging the PR with another round of feedback 👍
You can see the diff of my changes here: https://github.com/StanfordBDHG/LLMonFHIR/pull/29/files/a69006baa6632549441d85e94229351f0b41ccf8..bc86ecbe757ad8995c0aa8a6a0e1a0770102aec5 ... the biggest diff comes from reducing the indent a bit by using guard statements and removing code where Xcode warned us that types and properties were unused.
I have approved the PR in its current state and you can merge it at any time that works for you 🎉
Thank you for all the work and effort that went into the PR and improving LLMonFHIR!
Function Calling
♻️ Current situation & Problem
Currently, when the user taps "Chat with all resources", all FHIR resources are sent to the LLM. This can lead to token limit errors with the API. By using function calling (#22), we can enable a structured interaction with health records (https://platform.openai.com/docs/guides/gpt/faq).
💡 Proposed solution
The function `get_resource_titles` is added to the LLM. The LLM uses this function to determine which resource(s) are applicable to the user's question. Based on the function call response, the corresponding JSON data is given to the LLM. This way, the LLM only receives FHIR resources relevant to the user's question. Additionally, instead of providing all resources to start the interaction, the interpreter is only fed the titles and categories of all resources to provide an overall summary.

⚙️ Release Notes
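As an illustrative sketch of the declaration described above (the parameter name `resources` and all description strings here are assumptions, not the PR's actual values), a `get_resource_titles` definition following the OpenAI function-calling format could look like:

```swift
import Foundation

// Hypothetical sketch of a `get_resource_titles` declaration in the
// OpenAI function-calling format (name, description, JSON Schema parameters).
// The parameter name and descriptions are placeholders, not the PR's prompts.
let getResourceTitlesFunction: [String: Any] = [
    "name": "get_resource_titles",
    "description": "Returns the titles of the FHIR resources relevant to the user's question.",
    "parameters": [
        "type": "object",
        "properties": [
            "resources": [
                "type": "string",
                "description": "Comma-separated titles of the requested FHIR resources."
            ]
        ],
        "required": ["resources"]
    ] as [String: Any]
]
```

In the PR itself this information is carried by the OpenAI Swift package's `ChatFunctionDeclaration` type mentioned in the release notes; the dictionary above only mirrors the underlying JSON payload sent to the API.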
One of the main additions to the code was the `ChatFunctionDeclaration` for `get_resource_titles` in `OpenAIChatView.swift`. The chat view was also abstracted into various functions to handle function calling, parse the JSON response from the API, and send the LLM the relevant resources based on the function call.
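When the model calls the function, its arguments arrive as a JSON string that must be parsed before the matching resources can be looked up. A minimal, hypothetical sketch of that parsing step (the argument shape and the helper names below are assumptions, not the PR's actual code):

```swift
import Foundation

// Hypothetical shape of the arguments JSON the model returns when it calls
// `get_resource_titles`, e.g. {"resources": "Medication Request, Blood Pressure"}.
struct GetResourceTitlesArguments: Decodable {
    let resources: String

    /// Splits the comma-separated resource list returned by the LLM.
    var resourceTitles: [String] {
        resources
            .split(separator: ",")
            .map { $0.trimmingCharacters(in: .whitespaces) }
            .filter { !$0.isEmpty }
    }
}

/// Parses the function call's `arguments` string; returns nil for malformed JSON.
func parseResourceTitles(fromArguments json: String) -> [String]? {
    guard let data = json.data(using: .utf8),
          let arguments = try? JSONDecoder().decode(GetResourceTitlesArguments.self, from: data) else {
        return nil
    }
    return arguments.resourceTitles
}
```

Returning `nil` instead of throwing keeps the chat flow simple: a malformed function call can be answered with a retry prompt rather than crashing the view.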
The multiple resource interpreter prompt was modified and three additional prompts were added for the function calling: `MULTIPLE_RESOURCE_FUNCTION_DESCRIPTION`, `MULTIPLE_RESOURCE_PARAMETER_DESCRIPTION`, and `MULTIPLE_RESOURCE_FUNCTION_CONTEXT` in the `Localizable.strings` file.

Code of Conduct & Contributing Guidelines
By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines: