[BUG]: Auto compaction not triggering reliably with Anthropic models #6068

@ramarivera

Description

Auto compaction seems to have stopped working reliably with Anthropic models. Previously, when the session's context tokens approached the model's limit, opencode would automatically trigger compaction to summarize the session. Now we're hitting 'prompt is too long' errors more frequently without auto compaction kicking in.

Steps To Reproduce

  1. Start a session with an Anthropic model (e.g., claude-sonnet-4-20250514)
  2. Have an extended conversation with multiple tool calls and file edits
  3. Continue working until context approaches the model's token limit (e.g., 131072 or 200000)
  4. Observe that instead of auto compaction running, you receive errors like:
    This model's maximum context length is 131072 tokens. However, you requested N tokens...
    
    or
    prompt is too long: X tokens > Y maximum
    

Expected Behavior

Opencode should automatically trigger session compaction BEFORE hitting the model's context limit, summarizing previous messages to free up context for continuing the conversation.
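For reference, the expected trigger logic would look something like the sketch below. This is purely illustrative and not opencode's actual implementation; the names (`ModelInfo`, `shouldCompact`, `margin`) and the 90% safety margin are assumptions:

```typescript
// Hypothetical sketch of a pre-request auto-compaction check.
// None of these names come from opencode's codebase.
interface ModelInfo {
  contextLimit: number;    // e.g. 131072 or 200000 for Anthropic models
  maxOutputTokens: number; // tokens reserved for the model's response
}

// Compact once the prompt would leave too little headroom for the
// reserved output, using a safety margin (assumed 90%) so compaction
// fires BEFORE the provider rejects the request.
function shouldCompact(
  usedTokens: number,
  model: ModelInfo,
  margin = 0.9,
): boolean {
  const available = model.contextLimit - model.maxOutputTokens;
  return usedTokens > available * margin;
}
```

If a check like this runs only after certain events (or uses stale token counts from the provider's usage metadata), a burst of large tool results could jump the session past the limit without the threshold ever firing, which would match the behavior reported here.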

Actual Behavior

Auto compaction is not triggering automatically, leading to 'prompt too long' errors that stop the session. Manual compaction (via keybind) still works.

Relevant Log Output

# Running with verbose logging can provide more details:
OPENCODE_LOG=* opencode run <project>

Additional Context

Questions

  1. Did anything change in the auto compaction logic recently?
  2. Are there any new configuration options affecting compaction behavior?
  3. Is there any way to adjust the threshold or make compaction more aggressive?

Thanks for investigating this!
