Handle parallel function calls for OpenAI LLMs #522
Changes from all commits: def04ac, a5c73ec, 6ad3437, 0499fe4
@@ -34,7 +34,12 @@
 async def start_fetch_weather(function_name, llm, context):
-    await llm.push_frame(TextFrame("Let me check on that."))
+    # note: we can't push a frame to the LLM here. the bot
+    # can interrupt itself and/or cause audio overlapping glitches.
+    # possible question for Aleix and Chad about what the right way
+    # to trigger speech is, now, with the new queues/async/sync refactors.
+    # await llm.push_frame(TextFrame("Let me check on that."))
     logger.debug(f"Starting fetch_weather_from_api with function_name: {function_name}")

 async def fetch_weather_from_api(function_name, tool_call_id, args, llm, context, result_callback):
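The hunk above removes the eager `push_frame` call and dispatches each tool call through a callback instead. As a rough illustration of the PR's theme, parallel function calls, here is a minimal, self-contained asyncio sketch (the names `fetch_weather_from_api` aside, none of this is Pipecat's or OpenAI's actual API; the tool-call shape and helper are assumptions for illustration) showing how several tool calls from one LLM response can run concurrently rather than serially:

```python
import asyncio

async def fetch_weather_from_api(tool_call_id, args):
    # stand-in for a real HTTP request to a weather service
    await asyncio.sleep(0.01)
    return {"tool_call_id": tool_call_id, "conditions": "sunny", **args}

async def run_tool_calls(tool_calls):
    # tool_calls loosely mimics the shape of OpenAI's message.tool_calls
    # entries; each one becomes its own coroutine
    tasks = [
        fetch_weather_from_api(call["id"], call["arguments"])
        for call in tool_calls
    ]
    # run all function calls in parallel and keep results in call order
    return await asyncio.gather(*tasks)

results = asyncio.run(run_tool_calls([
    {"id": "call_1", "arguments": {"location": "Lisbon"}},
    {"id": "call_2", "arguments": {"location": "Austin"}},
]))
```

With `asyncio.gather` the total latency is roughly that of the slowest call, not the sum of all calls, which is the practical payoff of handling parallel tool calls at all.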
@@ -106,11 +111,11 @@ async def main():
     pipeline = Pipeline(
         [
-            fl_in,
+            # fl_in,
             transport.input(),
             context_aggregator.user(),
             llm,
-            fl_out,
+            # fl_out,
             tts,
             transport.output(),
             context_aggregator.assistant(),

Review thread on the commented-out frame loggers:

- Should we just remove the frame loggers?
- Probably. I want to leave them in an example or two so people know they exist, though.
- I was thinking the same thing as @chadbailey59. No reason not to leave them in a few examples as, um, examples.
- If we leave them, I think we should uncomment them.
The rule is... if a processor can push frames from a different task then it should be async. But function calls happen in the same task, no? If so, this should be OK.
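The rule above, that a processor receiving frames from other tasks should itself be async, can be illustrated with a toy queue-based processor (class and method names here are hypothetical, not Pipecat's real ones): producers on any task only enqueue, and a single consumer task drains the queue, so concurrent pushes can never interleave output.

```python
import asyncio

class AsyncProcessor:
    """Toy processor: push_frame is safe from any task because it only
    enqueues; a single consumer task processes frames in arrival order."""

    def __init__(self):
        self._queue = asyncio.Queue()
        self.processed = []

    async def push_frame(self, frame):
        # callable from any task; just hand the frame to the queue
        await self._queue.put(frame)

    async def run(self, expected):
        # single consumer: the only place frames are actually handled
        for _ in range(expected):
            frame = await self._queue.get()
            self.processed.append(frame)

async def main():
    proc = AsyncProcessor()
    consumer = asyncio.create_task(proc.run(expected=4))
    # two "producers" pushing frames concurrently from separate coroutines
    await asyncio.gather(
        *(proc.push_frame(f"a{i}") for i in range(2)),
        *(proc.push_frame(f"b{i}") for i in range(2)),
    )
    await consumer
    return proc.processed

frames = asyncio.run(main())
```

If function-call results are produced on the same task as the processor's own loop, the queue indirection is not strictly needed, which is the distinction the comment is drawing.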
I just realized that, by default, all processors should be async, not the other way around. It makes more sense and it's safer this way. That was my initial idea, but then I changed my mind; I now think the original idea was right. I'll make the change in the morning, and I believe that line should then be safe.
I will rebase and test again after this change.