
Remove unnecessary prompt caching from example message#26

Closed
enyst wants to merge 1 commit into main from enyst/prompt-caching

Conversation

enyst (Owner) commented Dec 13, 2024

The example message is not used with Anthropic models, which are the only models that support prompt caching. Therefore, setting cache_prompt on the example message was unnecessary.

This change removes the cache_prompt parameter from the example message, making the caching points clearer and more accurate:

  1. One cache point for the system message (at start)
  2. Three more cache points from the reversed messages loop for user/tool messages

This gives us exactly 4 cache points total, which is what we want for Anthropic models.
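The two-step scheme described above (one cache point on the system message, then up to three more assigned to the most recent user/tool messages by iterating in reverse) can be sketched as follows. This is a minimal illustration, not the project's actual code; the `Message` class, `cache_prompt` flag, and `apply_cache_points` function are assumed names for the purpose of the example.

```python
from dataclasses import dataclass

MAX_CACHE_POINTS = 4  # Anthropic models support at most 4 cache breakpoints


@dataclass
class Message:
    role: str            # "system", "user", "assistant", or "tool"
    content: str
    cache_prompt: bool = False  # illustrative flag, as described in the PR


def apply_cache_points(messages: list[Message]) -> list[Message]:
    """Mark at most MAX_CACHE_POINTS messages for prompt caching:
    the system message plus the most recent user/tool messages."""
    remaining = MAX_CACHE_POINTS

    # 1. One cache point for the system message (at the start).
    for msg in messages:
        if msg.role == "system":
            msg.cache_prompt = True
            remaining -= 1
            break

    # 2. Up to three more cache points from the reversed messages loop,
    #    marking user/tool messages from most recent to oldest.
    for msg in reversed(messages):
        if remaining == 0:
            break
        if msg.role in ("user", "tool") and not msg.cache_prompt:
            msg.cache_prompt = True
            remaining -= 1

    return messages
```

Because the example message never reaches an Anthropic model, it simply never needs a `cache_prompt` flag, which is why removing it leaves exactly the four intended cache points.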

This PR is stale because it has been open for 30 days with no activity. Remove the stale label or comment, or it will be closed in 7 days.

@github-actions github-actions bot added the Stale label Jan 13, 2025
This PR was closed because it has been stalled for over 30 days with no activity.

@github-actions github-actions bot closed this Jan 21, 2025
@coderabbitai coderabbitai bot mentioned this pull request Sep 2, 2025