Compaction auto not working with Codex #8293

@brandon93s

Description

I've hit this a few times now when running long sessions:

{"type":"error","sequence_number":2,"error":{"type":"invalid_request_error","code":"context_length_exceeded","message":"Your input exceeds the context window of this model. Please adjust your input and try again.","param":"input"}}

I'm using GPT-5.2 Codex via OpenAI OAuth. Ideally Codex's auto-compaction would work, but in the absence of that I expected opencode's auto compaction to run before the error is triggered. I've set this explicitly in my config:

  "compaction": {
    "auto": true,
    "prune": true,
  },
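For reference, a minimal opencode.json sketch showing where that block sits (the $schema line is the usual boilerplate and an assumption here, not copied from my real config):

  {
    "$schema": "https://opencode.ai/config.json",
    "compaction": {
      "auto": true,
      "prune": true
    }
  }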

Plugins

none

OpenCode version

1.1.18

Steps to reproduce

No response

Screenshot and/or share link

(screenshot attached)

Operating System

macOS 15.6.1

Terminal

ghostty

Labels

bug: Something isn't working