
[PYD-877] Log OpenAI streaming response at the end instead of opening a span and attaching context in a generator that may not finish #107

Merged
merged 4 commits into from
May 4, 2024

Conversation

alexmojaki
Contributor

In the long term I'd like to be able to replace the log with a span that's ended instantly but has a proper start and end timestamp so that it shows a nice duration line in the UI, but that's lower priority.
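The shape of the fix in this PR can be sketched with plain Python generators (illustrative names and a list standing in for the logging call, not logfire's actual API): nothing is attached to the ambient context while chunks are yielded; instead a single record is emitted from a finally block, so it fires whether the stream is fully consumed or abandoned partway through.

```python
import time

def logged_stream(chunks, logs):
    # Sketch of the log-at-the-end approach (hypothetical names, not
    # logfire's real API): no context token is attached while streaming.
    # The finally block runs on normal exhaustion, on close(), and on
    # garbage collection, so a partially consumed stream is still logged.
    start = time.monotonic()
    received = []
    try:
        for chunk in chunks:
            received.append(chunk)
            yield chunk
    finally:
        duration = time.monotonic() - start
        logs.append((len(received), duration))

logs = []
stream = logged_stream(['a', 'b', 'c'], logs)
next(stream)    # consume one chunk...
stream.close()  # ...then abandon the stream: the log is still emitted
```

Because the only cleanup is appending a record, abandoning the generator from a different context can no longer raise during detach.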

@alexmojaki alexmojaki requested review from samuelcolvin and Kludex May 3, 2024 14:30

linear bot commented May 3, 2024


codecov bot commented May 3, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅


@alexmojaki
Contributor Author

@willbakst in #110 (comment) you said:

We've already implemented the majority of it in our library following how you instrument openai

so I looked, and I can see the same problem in https://github.com/Mirascope/mirascope/blob/5f6ca6038e0041b84dc8939517327d7ad2bcad8b/mirascope/logfire/logfire.py#L174, so you should probably imitate this PR.

To demonstrate the problem in mirascope terms, this:

from mirascope.logfire import with_logfire
from mirascope.openai import OpenAICall, OpenAICallParams

import logfire

logfire.configure()


@with_logfire
class RecipeRecommender(OpenAICall):
    prompt_template = 'Recommend recipes that use {ingredient} as an ingredient'

    ingredient: str

    call_params = OpenAICallParams(model='gpt-3.5-turbo-0125')


def main():
    stream = RecipeRecommender(ingredient='apples').stream()
    for chunk in stream:
        print(chunk)
        break


main()

gives the following error before this PR:

Failed to detach context
Traceback (most recent call last):
  File "/home/alex/work/platform/.venv/lib/python3.12/site-packages/logfire/_internal/integrations/openai.py", line 89, in __stream__
    yield chunk
GeneratorExit

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/alex/work/platform/.venv/lib/python3.12/site-packages/opentelemetry/context/__init__.py", line 158, in detach
    _RUNTIME_CONTEXT.detach(token)  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/alex/work/platform/.venv/lib/python3.12/site-packages/opentelemetry/context/contextvars_context.py", line 50, in detach
    self._current_context.reset(token)  # type: ignore
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
ValueError: <Token var=<ContextVar name='current_context' default={} at 0x7f444b347f10> at 0x7f4425b50480> was created in a different Context

So I'm guessing you'll also get similar errors when streaming from a different LLM provider such as Anthropic.
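The underlying failure mode can be reproduced without OpenTelemetry at all, using plain contextvars (a minimal sketch; instrumented_stream is a hypothetical stand-in for the pre-PR instrumentation pattern):

```python
import contextvars

current = contextvars.ContextVar('current', default=None)

def instrumented_stream():
    # Anti-pattern: attach a context token inside the generator and
    # detach it in a finally block. A Token can only be reset in the
    # Context it was created in.
    token = current.set('span-active')
    try:
        yield 'chunk'
    finally:
        current.reset(token)  # fails if run in a different Context

gen = instrumented_stream()
ctx = contextvars.copy_context()
ctx.run(next, gen)  # first chunk: the token is created inside ctx

err = None
try:
    gen.close()  # abandoning the stream raises GeneratorExit in the generator;
                 # the finally block now runs in the caller's Context
except ValueError as exc:  # "Token ... was created in a different Context"
    err = exc
```

The generator's finally block runs in whatever context the caller happens to close it from, which is exactly why detaching there is unsafe for a stream the consumer may stop iterating early.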

@willbakst
Contributor

@alexmojaki thank you for bringing this to my attention!

@alexmojaki alexmojaki force-pushed the alex/openai-generators branch from 0658fdb to 737b1ba Compare May 4, 2024 15:44

cloudflare-workers-and-pages bot commented May 4, 2024

Deploying logfire-docs with Cloudflare Pages

Latest commit: 737b1ba
Status: ✅  Deploy successful!
Preview URL: https://d1dcd642.logfire-docs.pages.dev
Branch Preview URL: https://alex-openai-generators.logfire-docs.pages.dev


@alexmojaki alexmojaki enabled auto-merge (squash) May 4, 2024 15:44
@alexmojaki alexmojaki merged commit 17813c9 into main May 4, 2024
11 checks passed
@alexmojaki alexmojaki deleted the alex/openai-generators branch May 4, 2024 15:46