
Function Calling #29

Merged
merged 13 commits into from
Aug 27, 2023
Conversation

AdritRao
Collaborator

@AdritRao AdritRao commented Aug 7, 2023

Function Calling

♻️ Current situation & Problem

Currently, when the user taps "Chat with all resources", all FHIR resources are sent to the LLM. This can lead to token limit errors with the API. By using function calling (#22), we can enable a structured interaction with health records (https://platform.openai.com/docs/guides/gpt/faq).

💡 Proposed solution

The function get_resource_titles is added to the LLM. The LLM uses this function to find out which resource(s) are applicable to the user's question. Based on the function calling response, the corresponding JSON data is given to the LLM. This way, the LLM is only given FHIR resources based on the user's question. Additionally, instead of providing all resources to start the interaction, the interpreter is only fed the titles and categories of all resources to provide an overall summary.
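The selection flow described above can be sketched as follows. This is a minimal, hypothetical illustration, not the actual LLMonFHIR implementation: `FHIRResourceSummary` and `selectResources` are stand-in names for whatever types the app uses to hold a resource's title, category, and JSON payload.

```swift
import Foundation

/// Hypothetical stand-in for a FHIR resource with the metadata the
/// interpreter is fed up front (title and category) plus its JSON payload.
struct FHIRResourceSummary {
    let title: String
    let category: String
    let json: String
}

/// Filters the full resource set down to the titles the LLM requested
/// via the `get_resource_titles` function call, so only those resources'
/// JSON is sent back to the model.
func selectResources(
    matching requestedTitles: [String],
    from allResources: [FHIRResourceSummary]
) -> [FHIRResourceSummary] {
    allResources.filter { requestedTitles.contains($0.title) }
}
```

Only the filtered subset is serialized into the follow-up message, which keeps the prompt well under the token limit even for large health records.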

⚙️ Release Notes

One of the main additions to the code was the ChatFunctionDeclaration for get_resource_titles in the OpenAIChatView.swift:

let functions = [
    ChatFunctionDeclaration(
        name: "get_resource_titles",
        description: String(localized: "MULTIPLE_RESOURCE_FUNCTION_DESCRIPTION"),
        parameters: JSONSchema(
            type: .object,
            properties: [
                "resources": .init(
                    type: .string,
                    description: String(localized: "MULTIPLE_RESOURCE_PARAMETER_DESCRIPTION"),
                    enumValues: stringResourcesArray
                )
            ],
            required: ["resources"]
        )
    )
]

The chat view was also abstracted into various functions to handle function calling, parsing the JSON response from the API, and sending the LLM the relevant resources based on the function calling.
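The JSON-parsing step mentioned above can be sketched as follows. The OpenAI API delivers function-call arguments as a JSON-encoded string (e.g. `{"resources": "Blood Pressure,Influenza Vaccine"}`); the names `FunctionArguments` and `parseRequestedTitles` are illustrative, not the actual helpers in `OpenAIChatView.swift`.

```swift
import Foundation

/// Hypothetical shape of the `get_resource_titles` arguments payload.
struct FunctionArguments: Decodable {
    let resources: String
}

/// Decodes the arguments string returned by the function call and splits
/// the comma-separated value into individual resource titles.
func parseRequestedTitles(fromArguments json: String) -> [String] {
    guard let data = json.data(using: .utf8),
          let arguments = try? JSONDecoder().decode(FunctionArguments.self, from: data) else {
        return []
    }
    return arguments.resources
        .split(separator: ",")
        .map { $0.trimmingCharacters(in: .whitespaces) }
}
```

Returning an empty array on malformed input lets the chat flow fall back gracefully instead of crashing on an unexpected model response.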

The multiple resource interpreter prompt was modified and three additional prompts were added for the function calling: MULTIPLE_RESOURCE_FUNCTION_DESCRIPTION, MULTIPLE_RESOURCE_PARAMETER_DESCRIPTION, and MULTIPLE_RESOURCE_FUNCTION_CONTEXT in the Localizable.strings file.

Code of Conduct & Contributing Guidelines

By creating this pull request, you agree to follow our Code of Conduct and Contributing Guidelines.

Member

@PSchmiedmayer PSchmiedmayer left a comment


Thank you for the good first draft of the function calling API integration @AdritRao! It has all the necessary pieces in place; I just had a few comments on the general structure and some OpenAI-specific features that can be improved to fully reflect the API spec and to ensure good reusability of the code across the two use cases where we have function calls and where we won't use them.

Let me know if you have any questions regarding the specific comments, happy to help and support in any way I can to get this PR merged 🚀

@AdritRao
Collaborator Author

Hi @PSchmiedmayer, thank you for your comments! I have addressed them and also added the support for function calling using the updated Spezi ML module (StanfordSpezi/SpeziLLM#22). The main change after the comments was the addition of the custom ChatFunctionCall class and populating it based on the chat stream from Spezi ML.
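The custom `ChatFunctionCall` mentioned above might look roughly like the following sketch. This is an assumption about its shape, not the actual class: in streaming mode the API sends the function name once and the arguments string in fragments, so the type needs to accumulate deltas as they arrive from the Spezi ML chat stream.

```swift
/// Hypothetical sketch of a function call accumulated from streamed deltas.
/// Field and method names are illustrative.
struct ChatFunctionCall {
    var name: String = ""
    var arguments: String = ""

    /// Appends the partial name and/or arguments from one stream chunk.
    mutating func apply(nameDelta: String?, argumentsDelta: String?) {
        if let delta = nameDelta {
            name += delta
        }
        if let delta = argumentsDelta {
            arguments += delta
        }
    }
}
```

Once the stream finishes, `arguments` holds the complete JSON string and can be decoded in one step.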

@AdritRao
Collaborator Author

Hi @PSchmiedmayer, thank you for your feedback! I have addressed the comments in the latest commit. I modified the code so that the .system message with function context is provided at the beginning of the chat. The function call is now performed with the full chat as input. I have done some testing and so far, the LLM is able to prevent itself from re-querying the same records!
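The "system message first" arrangement described above can be sketched as follows. `Role`, `ChatMessage`, and `withFunctionContext` are illustrative stand-ins for the Spezi ML chat types, not the actual API.

```swift
/// Hypothetical stand-ins for the chat message types.
enum Role { case system, user, assistant, function }

struct ChatMessage {
    let role: Role
    let content: String
}

/// Ensures the function-calling context is the first message so the model
/// sees it on every call, including follow-up turns that pass the full
/// chat history as input.
func withFunctionContext(_ chat: [ChatMessage], context: String) -> [ChatMessage] {
    // If a system message already leads the chat, leave it untouched.
    if chat.first?.role == .system {
        return chat
    }
    return [ChatMessage(role: .system, content: context)] + chat
}
```

Because the full history (including prior function results) is passed on each turn, the model can see which records it has already retrieved and avoid re-querying them.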

Member

@PSchmiedmayer PSchmiedmayer left a comment


Thank you for the improvements @AdritRao!

I took a look at the PR and made some smaller refinements and code improvements that should not hold us up from merging the PR without doing another round of feedback 👍

You can see the diff of my changes here: https://github.com/StanfordBDHG/LLMonFHIR/pull/29/files/a69006baa6632549441d85e94229351f0b41ccf8..bc86ecbe757ad8995c0aa8a6a0e1a0770102aec5 ... the biggest diff comes from reducing the indentation a bit by using guard statements and removing code where Xcode warned us that types and properties were unused.

I have approved the PR in its current state and you can merge it at any time that works for you 🎉

Thank you for all the work and effort that went into the PR and improving LLMonFHIR!

@PSchmiedmayer PSchmiedmayer marked this pull request as ready for review August 27, 2023 01:45
@AdritRao AdritRao merged commit bbe5604 into main Aug 27, 2023
3 checks passed
@AdritRao AdritRao deleted the function_calling branch August 27, 2023 17:04