This repository has been archived by the owner on Jun 5, 2024. It is now read-only.

Bedrock: Update Llm events to store content as token_count on LlmEmbedding and LlmChatCompletionMessage #264

Closed
bizob2828 opened this issue Feb 29, 2024 · 1 comment · Fixed by #269
Assignees
Labels
points:3 A few days

Comments

@bizob2828
Member

Description

To avoid breaking the current API, the existing attributes that store content will remain until a later date. The following updates should be made to LlmEmbedding and LlmChatCompletionMessage for Bedrock instrumentation:

  • LlmEmbedding - capture the prompt token count as token_count
  • LlmChatCompletionMessage - capture the prompt or completion token count as token_count
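A minimal sketch of the change described above, assuming plain event classes and a caller-supplied token-counting callback. The class and property names other than `token_count` are illustrative assumptions, not the agent's actual API, and the whitespace tokenizer stands in for a real model-specific counter:

```javascript
// Hypothetical event shapes; only the token_count attribute name comes from the issue.
class LlmEmbedding {
  constructor({ input, tokenCountCallback }) {
    this.input = input
    // Capture the prompt token count as token_count.
    this.token_count = tokenCountCallback(input)
  }
}

class LlmChatCompletionMessage {
  constructor({ content, isResponse, tokenCountCallback }) {
    this.content = content
    this.is_response = Boolean(isResponse)
    // Prompt token count for request messages, completion token count for responses.
    this.token_count = tokenCountCallback(content)
  }
}

// Naive whitespace tokenizer used only for illustration.
const countTokens = (text) => text.trim().split(/\s+/).length

const embedding = new LlmEmbedding({
  input: 'hello world',
  tokenCountCallback: countTokens
})
console.log(embedding.token_count) // 2
```

Keeping the existing content-bearing attributes alongside the new `token_count` lets consumers migrate without a breaking change.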
@workato-integration


2 participants