feat: add support for login with ChatGPT #1212
Merged
This does not implement the full Login with ChatGPT experience, but it should unblock people.
## What works

- The `codex` multitool now has a `login` subcommand, so you can run `codex login`, which should write `CODEX_HOME/auth.json` if you complete the flow successfully.
- The TUI will now read the `OPENAI_API_KEY` from `auth.json`.
- There is a new `LoginScreen` in the TUI that tells you to run `codex login` if both (1) your model provider expects to use `OPENAI_API_KEY` as its env var, and (2) `OPENAI_API_KEY` is not set.

## What does not work

- The `LoginScreen` does not support the login flow from within the TUI. Instead, it tells you to quit, run `codex login`, and then run `codex` again.
- `codex exec` does not read from `auth.json` yet, nor does it direct the user to go through the login flow if `OPENAI_API_KEY` cannot be found.
- The `maybeRedeemCredits()` function from `get-api-key.tsx` has not been ported from TypeScript to `login_with_chatgpt.py` yet:

  `codex/codex-cli/src/utils/get-api-key.tsx`, lines 84 to 89 in a67a67f
## Implementation

Currently, the OAuth flow requires running a local webserver on `127.0.0.1:1455`. It seemed wasteful to incur the additional binary cost of a webserver dependency in the Rust CLI just to support login, so instead we implement this logic in Python, as Python has an `http.server` module as part of its standard library. Specifically, we bundle the contents of a single Python file as a string in the Rust CLI and then use it to spawn a subprocess as `python3 -c {{SOURCE_FOR_PYTHON_SERVER}}`.

As such, the most significant files in this PR are:
Now that the CLI may load `OPENAI_API_KEY` from the environment or `CODEX_HOME/auth.json`, we need a new abstraction for reading/writing this variable, so we introduce:

Note that `std::env::set_var()` is [rightfully] `unsafe` in Rust 2024, so we use a `LazyLock<RwLock<Option<String>>>` to store `OPENAI_API_KEY` so it is read in a thread-safe manner.

Ultimately, it should be possible to go through the entire login flow from the TUI. This PR introduces a placeholder `LoginScreen` UI for that right now, though the new `codex login` subcommand introduced in this PR should be a viable workaround until the UI is ready.
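The thread-safe storage described above can be sketched roughly as follows. The setter/getter names are hypothetical, and the setter is shown being called directly rather than from real `auth.json` loading:

```rust
use std::sync::{LazyLock, RwLock};

// Process-wide slot for OPENAI_API_KEY. Because std::env::set_var() is
// unsafe in Rust 2024, mutations go through an RwLock instead of the
// process environment; the initial value is seeded from the env var.
static OPENAI_API_KEY: LazyLock<RwLock<Option<String>>> =
    LazyLock::new(|| RwLock::new(std::env::var("OPENAI_API_KEY").ok()));

// Hypothetical setter, e.g. invoked after auth.json is read at startup.
fn set_openai_api_key(value: String) {
    *OPENAI_API_KEY.write().expect("lock poisoned") = Some(value);
}

// Hypothetical getter used wherever the key is needed; safe to call from
// any thread.
fn get_openai_api_key() -> Option<String> {
    OPENAI_API_KEY.read().expect("lock poisoned").clone()
}

fn main() {
    set_openai_api_key("sk-example".to_string());
    println!("{:?}", get_openai_api_key()); // prints: Some("sk-example")
}
```

`LazyLock` is in the standard library as of Rust 1.80, so this adds no dependencies either.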
## Testing

Because the login flow is currently implemented in a standalone Python file, you can test it without building any Rust code as follows:
For reference: