feat: Adds telemetry and fixes usage metadata for live mode #2325
base: main
Conversation
Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA). View this failed invocation of the CLA check for more information. For the most up to date status, view the checks section at the bottom of the pull request.
ravi-incred left a comment:
When are we planning to merge this PR?
This doesn't appear to have been merged into the adk-python repository yet. Do you have any information on when it might be merged? Thanks!
```python
span = trace.get_current_span()
# Special standard OpenTelemetry GenAI attributes that indicate
# that this is a span related to a Generative AI system.
# For live events with usage metadata, create a new span for each event
```
Hm, what's the reasoning for creating a new span for every live event? Just wondering if it would cause too much overhead by generating too many spans.
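For context, a minimal sketch of what per-event spans might look like with the OpenTelemetry Python SDK. The tracer name, event shape, and token-count fields below are assumptions for illustration, not the PR's actual implementation; only the OpenTelemetry API calls and the `gen_ai.*` semantic-convention attribute names are standard:

```python
# Hypothetical sketch: emit one short-lived span per live event.
# Event shape and field names are assumptions, not the real ADK types.
from opentelemetry import trace

tracer = trace.get_tracer("adk.live.telemetry")  # tracer name is illustrative

def record_live_event(event) -> None:
    """Create one span per live event that carries usage metadata."""
    with tracer.start_as_current_span("gen_ai.live_event") as span:
        # Standard OTel GenAI semantic-convention attributes marking this
        # span as belonging to a Generative AI system.
        span.set_attribute("gen_ai.system", "gemini")
        if event.usage_metadata is not None:
            span.set_attribute(
                "gen_ai.usage.input_tokens",
                event.usage_metadata.prompt_token_count,
            )
            span.set_attribute(
                "gen_ai.usage.output_tokens",
                event.usage_metadata.candidates_token_count,
            )
```

The overhead concern is that live sessions can emit many events per second (e.g. streaming audio chunks), so each one would allocate, end, and export its own span; recording token counts as span events or attributes on a single long-lived session span would be a lighter-weight alternative.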
Thanks for creating this PR! Can you resolve the merge conflicts as well?
```diff
-    content=content, interrupted=message.server_content.interrupted
+    content=content,
+    interrupted=message.server_content.interrupted,
+    usage_metadata=self._fix_usage_metadata(
```
Will this cause duplicated usage metadata since we're adding it to all the LlmResponses? I.e., in the case that a message contains both content.parts and message.server_content.input_transcription.
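To make the duplication concern concrete, here is a self-contained sketch using stand-in dataclasses (not the real ADK types): if the same usage metadata is attached to every LlmResponse fanned out from one server message, any consumer that sums token counts per response will double-count.

```python
# Illustrative sketch of the double-counting risk raised above.
# All types here are stand-ins, not the actual ADK classes.
from dataclasses import dataclass
from typing import Optional

@dataclass
class UsageMetadata:
    prompt_tokens: int
    response_tokens: int

@dataclass
class LlmResponse:
    kind: str  # e.g. "content" or "input_transcription"
    usage_metadata: Optional[UsageMetadata] = None

def split_message(has_content: bool, has_transcription: bool,
                  usage: UsageMetadata) -> list[LlmResponse]:
    """One server message can fan out into several LlmResponses.
    Attaching the same usage metadata to each is where duplication
    sneaks in."""
    responses = []
    if has_content:
        responses.append(LlmResponse("content", usage_metadata=usage))
    if has_transcription:
        responses.append(
            LlmResponse("input_transcription", usage_metadata=usage))
    return responses

# A message carrying both parts yields two responses with identical usage:
usage = UsageMetadata(prompt_tokens=10, response_tokens=25)
out = split_message(has_content=True, has_transcription=True, usage=usage)
total = sum(r.usage_metadata.response_tokens for r in out if r.usage_metadata)
print(total)  # 50, not 25 -- a naive aggregator double-counts
```

One way to avoid this would be to attach usage metadata to only one response per message (e.g. the final one), or to deduplicate by message when aggregating.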
Fix #2323