
Add OpenAI usage deserialization structs and last_cache field #349

@bug-ops

Description


Parent: #348

Task

Add the deserialization structs and a `last_cache` field to `OpenAiProvider` for tracking prompt cache usage.

Implementation

In `crates/zeph-llm/src/openai.rs`:

  1. Add deserialization structs:

     ```rust
     #[derive(Deserialize, Debug)]
     struct OpenAiUsage {
         #[serde(default)]
         prompt_tokens: u64,
         #[serde(default)]
         completion_tokens: u64,
         #[serde(default)]
         prompt_tokens_details: Option<PromptTokensDetails>,
     }

     #[derive(Deserialize, Debug)]
     struct PromptTokensDetails {
         #[serde(default)]
         cached_tokens: u64,
     }
     ```

  2. Add a `last_cache: std::sync::Mutex<Option<(u64, u64)>>` field to `OpenAiProvider`

  3. Update `new()`, `Clone`, and `Debug` impls for the new field (mirror the Claude provider pattern)

  4. Add `store_cache_usage()` and `log_cache_usage()` helper methods
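The field, `Clone`/`Debug` impls, and helper methods described above could look roughly like the sketch below. Everything beyond the names stated in this issue is an assumption: the real `OpenAiProvider` has more fields, and the real logging presumably goes through the crate's logging facility rather than `println!`.

```rust
use std::sync::Mutex;

// Trimmed-down stand-in for OpenAiProvider; only the fields needed to
// illustrate the cache-tracking pattern are shown.
struct OpenAiProvider {
    model: String,
    // (prompt_tokens, cached_tokens) from the most recent response.
    last_cache: Mutex<Option<(u64, u64)>>,
}

impl OpenAiProvider {
    fn new(model: &str) -> Self {
        Self {
            model: model.to_string(),
            // Initialized to None per the acceptance criteria.
            last_cache: Mutex::new(None),
        }
    }

    // Record usage parsed from the latest response.
    fn store_cache_usage(&self, prompt_tokens: u64, cached_tokens: u64) {
        *self.last_cache.lock().unwrap() = Some((prompt_tokens, cached_tokens));
    }

    // Log the stored usage, if any (println! stands in for real logging).
    fn log_cache_usage(&self) {
        if let Some((prompt, cached)) = *self.last_cache.lock().unwrap() {
            println!("prompt_tokens={prompt} cached_tokens={cached}");
        }
    }
}

// Mutex is not Clone, so Clone must be written by hand; the clone gets a
// fresh, independent mutex reset to None.
impl Clone for OpenAiProvider {
    fn clone(&self) -> Self {
        Self {
            model: self.model.clone(),
            last_cache: Mutex::new(None),
        }
    }
}

// Debug output includes the cache info, locking the mutex to read it.
impl std::fmt::Debug for OpenAiProvider {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        f.debug_struct("OpenAiProvider")
            .field("model", &self.model)
            .field("last_cache", &*self.last_cache.lock().unwrap())
            .finish()
    }
}
```

The mutex (rather than a plain `Option`) lets the provider record usage through a shared reference, which matters if the provider is called through `&self`.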

Acceptance criteria

  • `OpenAiUsage` deserializes from OpenAI response JSON, including `prompt_tokens_details.cached_tokens`
  • `last_cache` field is initialized to `None` in the constructor
  • `Clone` creates an independent mutex initialized to `None`
  • `Debug` output includes the cache info

Metadata

Labels: enhancement (New feature or request)