
Conversation

@chenghao-mou
Member

@chenghao-mou chenghao-mou commented Nov 28, 2025

Add back otel messages from chat_ctx.

This should close #4116 and address the linked comment.

@chenghao-mou chenghao-mou requested a review from a team November 28, 2025 15:43
@theomonnom
Member

The tricky thing is how we're going to handle PII in a lot of different places in the future.

@ChenghaoMou
Contributor

> The tricky thing is how we're going to handle PII in a lot of different places in the future.

If we do PII redaction inside the ChatContext, we can keep changes to other places minimal, right?
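To make the idea concrete, here is a minimal sketch of what redaction inside the ChatContext could look like. Everything below (the `ChatContext` stand-in, `to_otel_attributes`, and the pattern set) is hypothetical for illustration, not the actual livekit/agents API:

```python
import re

# Hypothetical patterns for illustration; a real deployment would need a vetted set.
_PII_PATTERNS = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),
]


def redact(text: str) -> str:
    """Apply each pattern in turn, replacing matches with a placeholder."""
    for pattern, placeholder in _PII_PATTERNS:
        text = pattern.sub(placeholder, text)
    return text


class ChatContext:
    """Toy stand-in for the real ChatContext; holds (role, content) messages."""

    def __init__(self) -> None:
        self.messages: list[tuple[str, str]] = []

    def add_message(self, role: str, content: str) -> None:
        self.messages.append((role, content))

    def to_otel_attributes(self) -> dict[str, str]:
        # Redaction happens at the single point where span attributes are
        # built, so every exporter downstream only sees scrubbed text.
        return {
            f"chat.message.{i}.{role}": redact(content)
            for i, (role, content) in enumerate(self.messages)
        }
```

Because the scrubbing happens where the attributes are built, the change stays local to the ChatContext and the rest of the tracing code is untouched.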

@theomonnom
Member

The PII redaction isn't realtime; it's going to be done at the end of the session (on LiveKit Cloud).
So it's already too late: another version of the chat_ctx is already inside some traces (in the attributes).

@chenghao-mou
Member Author

I see. But currently, we are still reporting LLM output, which might also contain PII. Do we intend to:

  1. keep the codebase as is: no input reported, but output kept; or
  2. drop the output too?

Either way, this would create a discrepancy between Agent Observability and Otel traces for the moment. If we are going to support Otel PII handling eventually, I think we should at least provide consistency for now, especially considering users have to explicitly set up Otel today.

I like how Logfire handles this:

  1. it provides a set of predefined regexes for typical sensitive data;
  2. it provides a callback for more control (but the user takes responsibility for it being slow/complex).

Langfuse handles sensitive data similarly.
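As a sketch of that regex-plus-callback shape (the names `DEFAULT_PATTERNS`, `ScrubMatch`, and `scrub` below are illustrative assumptions, not Logfire's or Langfuse's actual API):

```python
import re
from dataclasses import dataclass
from typing import Callable, Optional

# Illustrative defaults only; not a production-grade pattern set.
DEFAULT_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "phone": re.compile(r"\b\+?\d{10,12}\b"),
}


@dataclass
class ScrubMatch:
    """Context passed to the user callback for each candidate match."""

    pattern_name: str
    value: str


# The callback returns replacement text, or None to fall back to the
# built-in placeholder for that pattern.
ScrubCallback = Callable[[ScrubMatch], Optional[str]]


def scrub(text: str, callback: Optional[ScrubCallback] = None) -> str:
    for name, pattern in DEFAULT_PATTERNS.items():

        def _replace(m: re.Match) -> str:
            if callback is not None:
                result = callback(ScrubMatch(name, m.group()))
                if result is not None:
                    return result
            return f"[Scrubbed:{name}]"

        text = pattern.sub(_replace, text)
    return text
```

A callback that returns `None` falls back to the built-in placeholder, so the predefined regexes still act as a safety net while the user keeps fine-grained control.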

Member

@theomonnom theomonnom left a comment


agree, consistency for now makes sense

@chenghao-mou chenghao-mou merged commit 51c2870 into main Dec 5, 2025
18 checks passed
@chenghao-mou chenghao-mou deleted the fix/otel-chat-messages branch December 5, 2025 09:56
meetakshay99 added a commit to meetakshay99/agents that referenced this pull request Dec 12, 2025
* main: (267 commits)
  AGT-2328: negative threshold in silero (livekit#4228)
  disable interruptions for agent greeting (livekit#4223)
  feature: GPT-5.2 support (livekit#4235)
  turn-detector: remove english model from readme (livekit#4233)
  add keep alive task for liveavatar plugin (livekit#4231)
  feat(warm-transfer): add sip_number parameter for outbound caller ID (livekit#4216)
  fix blocked send task in liveavatar plugin (livekit#4214)
  clear _q_updated right after await to avoid race conditions (livekit#4209)
  ensure playback_segments_count is consistent in the audio output chain (livekit#4211)
  fix inworld punctuation handling (livekit#4215)
  Inference: Rename fallback model name param (livekit#4202)
  fix race condition when stop background audio play handle (livekit#4197)
  fix watchfiles prevent agent process exit on sigterm (livekit#4194)
  feat(google): add streaming support for Gemini TTS models (livekit#4189)
  Add LiveAvatar Stop Session API Call + README Fix (livekit#4195)
  Fallback API for Inference (livekit#4099)
  feat(rime): expand update_options to accept all TTS parameters (livekit#4095)
  mistralai models update (livekit#4156)
  fix record.exc_info is not pickable when using LogQueueHandler (livekit#4185)
  Restore otel chat message (livekit#4118)
  ...


Development

Successfully merging this pull request may close these issues.

LLM generation input not part of llm_request span

4 participants