Closed
Labels: enhancement (New feature or request)
Description
Parent: #348
Task
Add the necessary structs and a field to `OpenAiProvider` for tracking prompt cache usage.
Implementation
In `crates/zeph-llm/src/openai.rs`:
- Add deserialization structs:

```rust
#[derive(Deserialize, Debug)]
struct OpenAiUsage {
    #[serde(default)]
    prompt_tokens: u64,
    #[serde(default)]
    completion_tokens: u64,
    #[serde(default)]
    prompt_tokens_details: Option<PromptTokensDetails>,
}

#[derive(Deserialize, Debug)]
struct PromptTokensDetails {
    #[serde(default)]
    cached_tokens: u64,
}
```

- Add a `last_cache: std::sync::Mutex<Option<(u64, u64)>>` field to `OpenAiProvider`
- Update `new()`, `Clone`, and `Debug` impls for the new field (mirror the Claude pattern)
- Add `store_cache_usage()` and `log_cache_usage()` helper methods
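The issue doesn't spell out the helper-method bodies, so the following is a minimal std-only sketch. The `Provider` struct, the `(prompt_tokens, cached_tokens)` tuple meaning, and the log message wording are assumptions; the serde structs above are omitted to keep it self-contained:

```rust
use std::sync::Mutex;

// Hypothetical stand-in for OpenAiProvider; only the cache field is shown.
struct Provider {
    last_cache: Mutex<Option<(u64, u64)>>,
}

impl Provider {
    fn new() -> Self {
        // Mirrors the acceptance criterion: start with no cache info.
        Self { last_cache: Mutex::new(None) }
    }

    // Record (prompt_tokens, cached_tokens) from the latest response.
    fn store_cache_usage(&self, prompt_tokens: u64, cached_tokens: u64) {
        *self.last_cache.lock().unwrap() = Some((prompt_tokens, cached_tokens));
    }

    // Log the most recent cache usage, if any was recorded.
    fn log_cache_usage(&self) {
        if let Some((prompt, cached)) = *self.last_cache.lock().unwrap() {
            eprintln!("prompt cache: {cached}/{prompt} tokens served from cache");
        }
    }
}

fn main() {
    let p = Provider::new();
    p.store_cache_usage(1200, 1024);
    p.log_cache_usage();
}
```

Interior mutability via `Mutex` lets the helpers take `&self`, which matters if the provider is shared behind an `Arc` across async request handlers.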
Acceptance criteria
- `OpenAiUsage` deserializes from OpenAI response JSON, including `prompt_tokens_details.cached_tokens`
- `last_cache` field initialized to `None` in the constructor
- `Clone` creates an independent mutex with `None`
- `Debug` output includes cache info
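The `Clone` and `Debug` criteria can be sketched with std only. `Mutex` is not `Clone`, so a manual impl is needed anyway; resetting the clone to `None` (rather than copying the value) follows the criterion above. The struct name and `Debug` layout here are illustrative assumptions:

```rust
use std::fmt;
use std::sync::Mutex;

// Hypothetical stand-in for OpenAiProvider; only the cache field is shown.
struct Provider {
    last_cache: Mutex<Option<(u64, u64)>>,
}

// A cloned provider gets its own mutex and starts with no cache info.
impl Clone for Provider {
    fn clone(&self) -> Self {
        Self { last_cache: Mutex::new(None) }
    }
}

// Debug output includes the current cache info.
impl fmt::Debug for Provider {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Copy the value out so the lock is released before formatting.
        let cache = *self.last_cache.lock().unwrap();
        f.debug_struct("Provider").field("last_cache", &cache).finish()
    }
}

fn main() {
    let p = Provider { last_cache: Mutex::new(Some((1200, 1024))) };
    let q = p.clone();
    assert!(q.last_cache.lock().unwrap().is_none()); // independent mutex, reset to None
    println!("{p:?}"); // prints: Provider { last_cache: Some((1200, 1024)) }
}
```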