Max_completion_tokens for codex cli #4767
Unanswered
jordan-carson asked this question in Q&A
Replies: 1 comment
Question on configuring a proper setup with Codex and AI-powered terminals in general.
What should max_completion_tokens be set to in the chat completions payload? Is 2^13 (8192) a best practice, or should one configure a larger value for monolithic repos?
Is gpt-5-codex set to 128k?
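For concreteness, here is a minimal sketch of what setting max_completion_tokens explicitly in a Chat Completions payload looks like. The 8192 cap and the gpt-5-codex model name are taken from the question above, not recommendations, and the sketch assumes the standard OpenAI HTTP endpoint rather than anything Codex CLI configures for you:

```python
# Minimal sketch: a Chat Completions request that caps response length via
# max_completion_tokens. The 8192 value mirrors the 2^13 figure in the
# question and is illustrative, not a recommended default.
import os
import requests

payload = {
    # Model name taken from the question; substitute a model available to your account.
    "model": "gpt-5-codex",
    "messages": [
        {"role": "user", "content": "Summarize the build errors in this repo."}
    ],
    # Upper bound on generated tokens, not a target; the model may stop earlier.
    "max_completion_tokens": 8192,
}

resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        "Content-Type": "application/json",
    },
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Note that max_completion_tokens only limits the generated output; repository size matters for the prompt/context side, which this parameter does not control.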