[Bugfix][Tracing] Fix the bug that generated message set failed. (#2456)
# Description

In the current implementation, the following error is raised:

> Invalid type ChatCompletionMessage for attribute 'llm.generated_message' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types

The bug occurs because the value was never serialized. This pull request modifies the `enrich_span_with_llm` function in `src/promptflow-tracing/promptflow/tracing/_trace.py` to apply `serialize_attribute` to the `generated_message` before setting it as an attribute of the `span`. This ensures the `generated_message` is in a format suitable for a span attribute.

# All Promptflow Contribution checklist:
- [ ] **The pull request does not introduce [breaking changes].**
- [ ] **CHANGELOG is updated for new features, bug fixes or other significant changes.**
- [ ] **I have read the [contribution guidelines](../CONTRIBUTING.md).**
- [ ] **Create an issue and link to the pull request to get dedicated review from the promptflow team. Learn more: [suggested workflow](../CONTRIBUTING.md#suggested-workflow).**

## General Guidelines and Best Practices
- [ ] Title of the pull request is clear and informative.
- [ ] There are a small number of commits, each of which has an informative message. This means that previously merged commits do not appear in the history of the PR. For more information on cleaning up the commits in your PR, [see this page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).

### Testing Guidelines
- [ ] Pull request includes test coverage for the included changes.

Co-authored-by: Heyi <heta@microsoft.com>
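The pattern behind the fix can be sketched as follows. OpenTelemetry span attributes only accept primitives (`bool`, `str`, `bytes`, `int`, `float`) or sequences of those, so a rich object like a `ChatCompletionMessage` must be serialized first. This is a minimal, self-contained illustration, not the actual promptflow implementation: the `ChatCompletionMessage` dataclass here is a hypothetical stand-in for the OpenAI SDK type, and the `serialize_attribute` body is an assumed JSON-based serialization, not the real one in `_trace.py`.

```python
import json
from dataclasses import asdict, dataclass, is_dataclass


# Hypothetical stand-in for the OpenAI SDK's ChatCompletionMessage;
# the real code receives the SDK object instead.
@dataclass
class ChatCompletionMessage:
    role: str
    content: str


def serialize_attribute(value):
    """Convert a value into a valid span attribute value.

    Primitives pass through unchanged; anything else is serialized to a
    JSON string. (Illustrative sketch; the real serialize_attribute in
    promptflow-tracing may differ.)
    """
    if isinstance(value, (bool, str, bytes, int, float)):
        return value
    if is_dataclass(value):
        value = asdict(value)
    try:
        return json.dumps(value, default=str)
    except TypeError:
        return str(value)


message = ChatCompletionMessage(role="assistant", content="Hello!")
attr = serialize_attribute(message)
# Setting the raw message object would raise the "Invalid type" error;
# the serialized string is accepted:
# span.set_attribute("llm.generated_message", attr)
```

The key point is that serialization happens once, at the boundary where the value is handed to the span, so the rest of the tracing code can keep working with the rich object.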