[Bugfix][Tracing] Fix the bug that generated message set failed. (#2456)
# Description

In the current implementation, the following error is raised:
Invalid type ChatCompletionMessage for attribute 'llm.generated_message'
value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a
sequence of those types

The bug occurs because the value was never serialized before being set on the span.

This pull request modifies the `enrich_span_with_llm` function in
`src/promptflow-tracing/promptflow/tracing/_trace.py` to pass
`generated_message` through `serialize_attribute` before setting it as an
attribute of the `span`. This ensures the `generated_message` is in a
format that span attributes accept.

# All Promptflow Contribution checklist:
- [ ] **The pull request does not introduce [breaking changes].**
- [ ] **CHANGELOG is updated for new features, bug fixes or other
significant changes.**
- [ ] **I have read the [contribution guidelines](../CONTRIBUTING.md).**
- [ ] **Create an issue and link to the pull request to get dedicated
review from promptflow team. Learn more: [suggested
workflow](../CONTRIBUTING.md#suggested-workflow).**

## General Guidelines and Best Practices
- [ ] Title of the pull request is clear and informative.
- [ ] There are a small number of commits, each of which has an
informative message. This means that previously merged commits do not
appear in the history of the PR. For more information on cleaning up the
commits in your PR, [see this
page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).

### Testing Guidelines
- [ ] Pull request includes test coverage for the included changes.

Co-authored-by: Heyi <heta@microsoft.com>
thy09 and Heyi authored Mar 26, 2024
1 parent 3a9e374 commit d429994
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/promptflow-tracing/promptflow/tracing/_trace.py
@@ -182,7 +182,7 @@ def enrich_span_with_original_attributes(span, attributes):
  def enrich_span_with_llm(span, model, generated_message):
      try:
          span.set_attribute("llm.response.model", model)
-         span.set_attribute("llm.generated_message", generated_message)
+         span.set_attribute("llm.generated_message", serialize_attribute(generated_message))
      except Exception as e:
          logging.warning(f"Failed to enrich span with llm: {e}")
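The underlying constraint is that OpenTelemetry span attributes only accept primitive types (`bool`, `str`, `bytes`, `int`, `float`) or sequences of them, so a complex object like a `ChatCompletionMessage` must be converted to a string first. A minimal sketch of the idea, assuming a JSON-based fallback (the real `serialize_attribute` in promptflow may behave differently):

```python
import json


def serialize_attribute(value):
    # Span attributes must be primitives; pass those through unchanged
    # and fall back to a JSON string for anything else.
    # (Assumed behavior; the actual promptflow helper may differ.)
    if isinstance(value, (bool, str, bytes, int, float)):
        return value
    try:
        return json.dumps(value, default=str)
    except TypeError:
        return str(value)


# A plain dict stands in for a ChatCompletionMessage here.
message = {"role": "assistant", "content": "Hello!"}
print(serialize_attribute(message))  # a JSON string, safe to set on a span
```

With this in place, `span.set_attribute("llm.generated_message", serialize_attribute(generated_message))` always receives an allowed type, which is exactly what the one-line change above accomplishes.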
