Implement last_cache_usage() and add unit tests for OpenAI cache tracking #351

Description

@bug-ops

Parent: #348
Depends on: #349, #350

Task

Implement the last_cache_usage() trait method and add comprehensive tests.

Implementation

  1. Add last_cache_usage() override in impl LlmProvider for OpenAiProvider:
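// Assumes last_cache is a Mutex<Option<(u64, u64)>> on OpenAiProvider,
// populated when a response's usage field is parsed; lock().ok() swallows
// mutex poisoning and yields None instead of panicking.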
fn last_cache_usage(&self) -> Option<(u64, u64)> {
    self.last_cache.lock().ok().and_then(|g| *g)
}
  2. Add unit tests (a minimal sketch follows this list):
  • OpenAiUsage deserialization with and without prompt_tokens_details
  • OpenAiUsage deserialization with cached_tokens: 0
  • PromptTokensDetails deserialization
  • OpenAiChatResponse with usage field present
  • ToolChatResponse with usage field present
  • last_cache_usage() returns None initially
  • Verify (0, cached_tokens) convention
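
A minimal sketch of the deserialization and convention tests, assuming serde and serde_json are available as (dev-)dependencies. The struct layouts and the cache_usage_from helper below are illustrative stand-ins, not the crate's real types:

```rust
// Hypothetical layout for illustration; the real types live in the OpenAI
// provider module and may differ.
#[derive(Debug, serde::Deserialize)]
struct PromptTokensDetails {
    cached_tokens: u64,
}

#[derive(Debug, serde::Deserialize)]
struct OpenAiUsage {
    prompt_tokens: u64,
    completion_tokens: u64,
    #[serde(default)]
    prompt_tokens_details: Option<PromptTokensDetails>,
}

/// Hypothetical helper showing the (0, cached_tokens) convention: OpenAI only
/// reports cached (read) prompt tokens, so the first slot of the tuple is 0.
fn cache_usage_from(usage: &OpenAiUsage) -> Option<(u64, u64)> {
    usage
        .prompt_tokens_details
        .as_ref()
        .map(|d| (0, d.cached_tokens))
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn deserializes_with_prompt_tokens_details() {
        let json = r#"{
            "prompt_tokens": 120,
            "completion_tokens": 8,
            "prompt_tokens_details": { "cached_tokens": 64 }
        }"#;
        let usage: OpenAiUsage = serde_json::from_str(json).unwrap();
        assert_eq!(usage.prompt_tokens, 120);
        assert_eq!(cache_usage_from(&usage), Some((0, 64)));
    }

    #[test]
    fn deserializes_without_prompt_tokens_details() {
        let json = r#"{ "prompt_tokens": 120, "completion_tokens": 8 }"#;
        let usage: OpenAiUsage = serde_json::from_str(json).unwrap();
        assert_eq!(cache_usage_from(&usage), None);
    }

    #[test]
    fn deserializes_with_zero_cached_tokens() {
        let json = r#"{
            "prompt_tokens": 12,
            "completion_tokens": 3,
            "prompt_tokens_details": { "cached_tokens": 0 }
        }"#;
        let usage: OpenAiUsage = serde_json::from_str(json).unwrap();
        assert_eq!(cache_usage_from(&usage), Some((0, 0)));
    }
}
```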

Acceptance criteria

  • last_cache_usage() returns correct values after a simulated API call (see the sketch after this list)
  • All new tests pass
  • Existing tests unaffected
  • cargo clippy --workspace -- -D warnings runs clean
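
For the first criterion, a self-contained sketch of the record-then-read flow using a hypothetical stand-in type; the real OpenAiProvider's fields, constructor, and response-parsing path are not shown in this issue and may differ:

```rust
use std::sync::Mutex;

// Hypothetical stand-in mirroring the assumed OpenAiProvider state: a Mutex
// holding the cache usage parsed from the most recent response, if any.
struct CacheTrackingProvider {
    last_cache: Mutex<Option<(u64, u64)>>,
}

impl CacheTrackingProvider {
    fn new() -> Self {
        Self { last_cache: Mutex::new(None) }
    }

    // Simulates the point where a response's usage has been parsed and the
    // cached token count is recorded using the (0, cached_tokens) convention.
    fn record_cached_tokens(&self, cached_tokens: u64) {
        if let Ok(mut guard) = self.last_cache.lock() {
            *guard = Some((0, cached_tokens));
        }
    }

    // Same shape as the trait method from step 1.
    fn last_cache_usage(&self) -> Option<(u64, u64)> {
        self.last_cache.lock().ok().and_then(|g| *g)
    }
}

#[test]
fn last_cache_usage_after_simulated_call() {
    let provider = CacheTrackingProvider::new();
    assert_eq!(provider.last_cache_usage(), None); // nothing recorded yet
    provider.record_cached_tokens(64);             // simulate a completed call
    assert_eq!(provider.last_cache_usage(), Some((0, 64)));
}
```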
