[PYD-877] Log OpenAI streaming response at the end instead of opening a span and attaching context in a generator that may not finish #107
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅
@willbakst in #110 (comment) you said
so I looked and I can see the same problem in https://github.com/Mirascope/mirascope/blob/5f6ca6038e0041b84dc8939517327d7ad2bcad8b/mirascope/logfire/logfire.py#L174, so you should probably imitate this PR. To demonstrate the problem in mirascope terms, this:

```python
from mirascope.logfire import with_logfire
from mirascope.openai import OpenAICall, OpenAICallParams

import logfire

logfire.configure()


@with_logfire
class RecipeRecommender(OpenAICall):
    prompt_template = 'Recommend recipes that use {ingredient} as an ingredient'

    ingredient: str

    call_params = OpenAICallParams(model='gpt-3.5-turbo-0125')


def main():
    stream = RecipeRecommender(ingredient='apples').stream()
    for chunk in stream:
        print(chunk)
        break


main()
```

gives the following error before this PR:
so I'm guessing you'll also get similar errors if you stream from a different LLM provider such as Anthropic.
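The approach the PR title describes (log the streaming response at the end, rather than opening a span and attaching context up front in a generator that may never finish) can be sketched in plain Python with no Logfire API at all. All names here are illustrative; the key point is that a `finally` clause fires even when the consumer abandons the stream early:

```python
def instrumented_stream(chunks, log):
    """Yield chunks from an LLM-style stream; emit one log entry at the
    end instead of opening a span before iteration begins."""
    received = []
    try:
        for chunk in chunks:
            received.append(chunk)
            yield chunk
    finally:
        # Runs on normal exhaustion, on close(), and at garbage
        # collection, so the log is emitted even if the caller breaks.
        log(f"streamed {len(received)} chunk(s)")


logs = []
stream = instrumented_stream(["a", "b", "c"], logs.append)
for chunk in stream:
    break  # abandon the stream early, as in the reproducer above
stream.close()  # triggers the finally clause deterministically

assert logs == ["streamed 1 chunk(s)"]
```

Because nothing is attached to the ambient context while the generator is suspended, an early `break` cannot leave a dangling span or detach a context token in the wrong place.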
@alexmojaki thank you for bringing this to my attention!
force-pushed from 0658fdb to 737b1ba
In the long term I'd like to be able to replace the log with a span that's ended instantly but has a proper start and end timestamp so that it shows a nice duration line in the UI, but that's lower priority.
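The idea above could be sketched roughly as follows (a stdlib-only stand-in, not Logfire's actual API): capture the wall-clock time when streaming starts, then emit a single span-like record carrying both timestamps once the stream ends, so the UI could render a duration line:

```python
import time


def stream_with_timing(chunks, emit):
    """Yield chunks; when the stream finishes, emit one record with real
    start and end timestamps, like a span that is ended retroactively."""
    start = time.time()
    count = 0
    try:
        for chunk in chunks:
            count += 1
            yield chunk
    finally:
        # One record with both timestamps, emitted only at the end.
        emit({"start": start, "end": time.time(), "chunks": count})


records = []
for _ in stream_with_timing(range(3), records.append):
    pass

span = records[0]
assert span["chunks"] == 3 and span["end"] >= span["start"]
```

The difference from a real span is only when the record is created: here nothing exists until the stream finishes, so there is never a half-open span to clean up.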