
OpenAI prompt caching: parse and report cached token usage #352

Merged
bug-ops merged 3 commits into main from feat/m22/openai-prompt-caching on Feb 16, 2026
Conversation

@bug-ops (Owner) commented on Feb 16, 2026

Summary

  • Parse cached_tokens from OpenAI response usage.prompt_tokens_details
  • Add Mutex<Option<(u64, u64)>> cache field to OpenAiProvider (mirrors Claude pattern)
  • Implement last_cache_usage() returning (0, cached_tokens) for MetricsCollector (a sketch of this shape follows the list)
  • 7 unit tests covering deserialization, storage, edge cases, and clone behavior
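A minimal sketch of that shape, assuming serde for deserialization. The field path usage.prompt_tokens_details.cached_tokens comes from the PR itself; the struct layout, the record_usage helper, and the provider's other fields are illustrative, not the actual zeph-llm source:

```rust
use std::sync::Mutex;

use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct PromptTokensDetails {
    #[serde(default)]
    cached_tokens: u64,
}

#[derive(Debug, Deserialize)]
struct Usage {
    prompt_tokens: u64,
    completion_tokens: u64,
    // Some responses omit this object entirely, so it stays optional.
    #[serde(default)]
    prompt_tokens_details: Option<PromptTokensDetails>,
}

struct OpenAiProvider {
    // (cache_creation_tokens, cache_read_tokens), mirroring the Claude
    // provider's two-element shape.
    last_cache: Mutex<Option<(u64, u64)>>,
}

impl OpenAiProvider {
    // Hypothetical helper: record cache usage after each response.
    fn record_usage(&self, usage: &Usage) {
        let cached = usage
            .prompt_tokens_details
            .as_ref()
            .map_or(0, |d| d.cached_tokens);
        // The lock is held only for this single store; nothing is
        // allocated while it is taken.
        *self.last_cache.lock().unwrap() = Some((0, cached));
    }

    // OpenAI reports only cache reads, so the creation slot is always 0.
    fn last_cache_usage(&self) -> Option<(u64, u64)> {
        *self.last_cache.lock().unwrap()
    }
}
```

Because the guarded value is a Copy tuple, readers take a brief lock and copy it out, which is what the performance note in the test plan below refers to.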

References

Closes #348, closes #349, closes #350, closes #351

Test plan

  • cargo nextest run --workspace --lib --bins: 1416 tests passed (an illustrative test in the same spirit follows this list)
  • cargo clippy --workspace -- -D warnings: zero warnings
  • Performance validation: minimal Mutex lock scope, no allocations in the hot path
  • Security validation: no sensitive data in logs or debug output
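
An illustrative deserialization test in the spirit of the suite described above, reusing the Usage struct from the sketch under Summary; the actual test names in crates/zeph-llm/src/openai.rs are not shown in this PR:

```rust
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn parses_cached_tokens_from_prompt_tokens_details() {
        let json = r#"{
            "prompt_tokens": 120,
            "completion_tokens": 8,
            "prompt_tokens_details": { "cached_tokens": 100 }
        }"#;
        let usage: Usage = serde_json::from_str(json).unwrap();
        let cached = usage.prompt_tokens_details.map(|d| d.cached_tokens);
        assert_eq!(cached, Some(100));
    }

    #[test]
    fn missing_details_is_treated_as_no_cache() {
        // Edge case: responses without a prompt_tokens_details object.
        let json = r#"{"prompt_tokens": 12, "completion_tokens": 3}"#;
        let usage: Usage = serde_json::from_str(json).unwrap();
        assert!(usage.prompt_tokens_details.is_none());
    }
}
```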

Parse cached_tokens from the OpenAI response usage.prompt_tokens_details,
store it in a Mutex-based field mirroring the Claude provider pattern, and
expose it through last_cache_usage() for MetricsCollector integration (a
collector-side usage sketch follows).

Closes #349, closes #350, closes #351
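
Continuing the sketch above, a hypothetical collector-side read; MetricsCollector internals are not part of this diff, so the function below is illustrative only:

```rust
// Hypothetical reporting hook, not an actual MetricsCollector method.
fn report_cache_metrics(provider: &OpenAiProvider) {
    if let Some((creation, read)) = provider.last_cache_usage() {
        // For OpenAI, `creation` is always 0: the API reports only how
        // many prompt tokens were served from the cache.
        println!("cache_creation_tokens={creation} cache_read_tokens={read}");
    }
}
```

Returning (0, cached_tokens) keeps the tuple shape compatible with the Claude provider, where cache creation and cache read tokens are reported separately.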
@github-actions bot added labels on Feb 16, 2026: documentation (Improvements or additions to documentation), llm (LLM provider related), rust, size/M
@codecov-commenter commented on Feb 16, 2026

Codecov Report

❌ Patch coverage is 93.93939% with 6 lines in your changes missing coverage. Please review.

Files with missing lines        Patch %   Lines
crates/zeph-llm/src/openai.rs   93.93%    6 Missing ⚠️


@@            Coverage Diff             @@
##             main     #352      +/-   ##
==========================================
+ Coverage   79.33%   79.38%   +0.05%     
==========================================
  Files         100      100              
  Lines       25665    25764      +99     
==========================================
+ Hits        20361    20454      +93     
- Misses       5304     5310       +6     
Files with missing lines        Coverage Δ
crates/zeph-llm/src/openai.rs   76.73% <93.93%> (+2.07%) ⬆️

bug-ops enabled auto-merge (squash) on February 16, 2026, 01:31
bug-ops merged commit 247da78 into main on Feb 16, 2026
18 checks passed
bug-ops deleted the feat/m22/openai-prompt-caching branch on February 16, 2026, 01:33