diff --git a/AGENTS.md b/AGENTS.md
index acd88bf7..d9829df0 100644
--- a/AGENTS.md
+++ b/AGENTS.md
@@ -128,6 +128,19 @@ Workflow: `.github/workflows/sync-agent-sdk-openapi.yml`
 - Use Mintlify components (``, ``, ``, etc.) where appropriate.
 - When linking internally, prefer **absolute** doc paths (e.g. `/overview/quickstart`).
 
+## LLM API Key Options
+
+The SDK documentation covers three ways for users to obtain LLM access:
+
+1. **Direct Provider** - Bring your own API key from providers like Anthropic, OpenAI, etc.
+2. **OpenHands Cloud** - Use OpenHands Cloud API keys (recommended for verified models)
+3. **Third-party Subscription Login** - Authenticate with existing subscriptions (e.g., ChatGPT Plus/Pro via `LLM.subscription_login()`)
+
+When documenting LLM setup or examples, ensure all three options are mentioned where appropriate:
+- `sdk/getting-started.mdx` - Main getting started page with AccordionGroup
+- `sdk/shared-snippets/how-to-run-example.mdx` - Shared snippet for running examples
+- `sdk/guides/llm-subscriptions.mdx` - Dedicated guide for subscription login
+
 ## Validation
 
 ### LLM pricing table validation
diff --git a/docs.json b/docs.json
index 977858db..05a29914 100644
--- a/docs.json
+++ b/docs.json
@@ -284,6 +284,7 @@
     {
       "group": "LLM Features",
       "pages": [
+        "sdk/guides/llm-subscriptions",
         "sdk/guides/llm-registry",
         "sdk/guides/llm-routing",
         "sdk/guides/llm-reasoning",
diff --git a/sdk/getting-started.mdx b/sdk/getting-started.mdx
index 3a0dd26a..0165ee29 100644
--- a/sdk/getting-started.mdx
+++ b/sdk/getting-started.mdx
@@ -45,6 +45,18 @@ The SDK requires an LLM API key from any [LiteLLM-supported provider](https://do
 [Learn more →](/openhands/usage/llms/openhands-llms)
+
+
+If you have a ChatGPT Plus or Pro subscription, you can use `LLM.subscription_login()` to authenticate with your ChatGPT account and access Codex models without consuming API credits.
+
+```python
+from openhands.sdk import LLM
+
+llm = LLM.subscription_login(vendor="openai", model="gpt-5.2-codex")
+```
+
+[Learn more →](/sdk/guides/llm-subscriptions)
+
 > Tip: Model name prefixes depend on your provider
diff --git a/sdk/guides/llm-subscriptions.mdx b/sdk/guides/llm-subscriptions.mdx
new file mode 100644
index 00000000..524d6e71
--- /dev/null
+++ b/sdk/guides/llm-subscriptions.mdx
@@ -0,0 +1,165 @@
+---
+title: LLM Subscriptions
+description: Use your ChatGPT Plus/Pro subscription to access Codex models without consuming API credits.
+---
+
+import RunExampleCode from "/sdk/shared-snippets/how-to-run-example.mdx";
+
+
+OpenAI is the first subscription provider we support. More providers will be added in future releases.
+
+
+> A ready-to-run example is available [here](#ready-to-run-example)!
+
+Use your existing ChatGPT Plus or Pro subscription to access OpenAI's Codex models without consuming API credits. The SDK handles OAuth authentication, credential caching, and automatic token refresh.
+
+## How It Works
+
+
+
+### Call subscription_login()
+
+The `LLM.subscription_login()` class method handles the entire authentication flow:
+
+```python icon="python"
+from openhands.sdk import LLM
+
+llm = LLM.subscription_login(vendor="openai", model="gpt-5.2-codex")
+```
+
+On first run, this opens your browser for OAuth authentication with OpenAI. After successful login, credentials are cached locally in `~/.openhands/auth/` for future use.
+
+
+### Use the LLM
+
+Once authenticated, use the LLM with your agent as usual.
+The SDK automatically refreshes tokens when they expire.
+
+
+
+## Supported Models
+
+The following models are available via ChatGPT subscription:
+
+| Model | Description |
+|-------|-------------|
+| `gpt-5.2-codex` | Latest Codex model (default) |
+| `gpt-5.2` | GPT-5.2 base model |
+| `gpt-5.1-codex-max` | High-capacity Codex model |
+| `gpt-5.1-codex-mini` | Lightweight Codex model |
+
+## Configuration Options
+
+### Force Fresh Login
+
+If your cached credentials become stale or you want to switch accounts:
+
+```python icon="python"
+llm = LLM.subscription_login(
+    vendor="openai",
+    model="gpt-5.2-codex",
+    force_login=True,  # Always perform fresh OAuth login
+)
+```
+
+### Disable Browser Auto-Open
+
+For headless environments, or when you prefer to open the URL manually:
+
+```python icon="python"
+llm = LLM.subscription_login(
+    vendor="openai",
+    model="gpt-5.2-codex",
+    open_browser=False,  # Prints URL to console instead
+)
+```
+
+### Check Subscription Mode
+
+Verify that the LLM is using subscription-based authentication:
+
+```python icon="python"
+llm = LLM.subscription_login(vendor="openai", model="gpt-5.2-codex")
+print(f"Using subscription: {llm.is_subscription}")  # True
+```
+
+## Credential Storage
+
+Credentials are stored securely in `~/.openhands/auth/`. To clear cached credentials and force a fresh login, delete the files in this directory.
+
+## Ready-to-run Example
+
+
+This example is available on GitHub: [examples/01_standalone_sdk/35_subscription_login.py](https://github.com/OpenHands/software-agent-sdk/blob/main/examples/01_standalone_sdk/35_subscription_login.py)
+
+
+```python icon="python" expandable examples/01_standalone_sdk/35_subscription_login.py
+"""Example: Using ChatGPT subscription for Codex models.
+
+This example demonstrates how to use your ChatGPT Plus/Pro subscription
+to access OpenAI's Codex models without consuming API credits.
+
+The subscription_login() method handles:
+- OAuth PKCE authentication flow
+- Credential caching (~/.openhands/auth/)
+- Automatic token refresh
+
+Supported models:
+- gpt-5.2-codex
+- gpt-5.2
+- gpt-5.1-codex-max
+- gpt-5.1-codex-mini
+
+Requirements:
+- Active ChatGPT Plus or Pro subscription
+- Browser access for initial OAuth login
+"""
+
+import os
+
+from openhands.sdk import LLM, Agent, Conversation, Tool
+from openhands.tools.file_editor import FileEditorTool
+from openhands.tools.terminal import TerminalTool
+
+
+# First time: Opens browser for OAuth login
+# Subsequent calls: Reuses cached credentials (auto-refreshes if expired)
+llm = LLM.subscription_login(
+    vendor="openai",
+    model="gpt-5.2-codex",  # or "gpt-5.2", "gpt-5.1-codex-max", "gpt-5.1-codex-mini"
+)
+
+# Alternative: Force a fresh login (useful if credentials are stale)
+# llm = LLM.subscription_login(vendor="openai", model="gpt-5.2-codex", force_login=True)
+
+# Alternative: Disable auto-opening browser (prints URL to console instead)
+# llm = LLM.subscription_login(
+#     vendor="openai", model="gpt-5.2-codex", open_browser=False
+# )
+
+# Verify subscription mode is active
+print(f"Using subscription mode: {llm.is_subscription}")
+
+# Use the LLM with an agent as usual
+agent = Agent(
+    llm=llm,
+    tools=[
+        Tool(name=TerminalTool.name),
+        Tool(name=FileEditorTool.name),
+    ],
+)
+
+cwd = os.getcwd()
+conversation = Conversation(agent=agent, workspace=cwd)
+
+conversation.send_message("List the files in the current directory.")
+conversation.run()
+print("Done!")
+```
+
+
+
+## Next Steps
+
+- **[LLM Registry](/sdk/guides/llm-registry)** - Manage multiple LLM configurations
+- **[LLM Streaming](/sdk/guides/llm-streaming)** - Stream responses token-by-token
+- **[LLM Reasoning](/sdk/guides/llm-reasoning)** - Access model reasoning traces
diff --git a/sdk/shared-snippets/how-to-run-example.mdx b/sdk/shared-snippets/how-to-run-example.mdx
index 8da0a874..b9f92a1b 100644
--- a/sdk/shared-snippets/how-to-run-example.mdx
+++ b/sdk/shared-snippets/how-to-run-example.mdx
@@ -1,5 +1,5 @@
 {/*
-  This snippet describe on how to run an example using your own API-key or the Openhands Cloud
+  This snippet describes how to run an example using your own API key, the OpenHands Cloud, or a ChatGPT subscription
 */}
 
 You can run the example code as-is.
@@ -15,4 +15,8 @@ You can run the example code as-is.
     {`# https://app.all-hands.dev/settings/api-keys\nexport LLM_API_KEY="your-openhands-api-key"\nexport LLM_MODEL="openhands/claude-sonnet-4-5-20250929"\ncd software-agent-sdk\nuv run python ${path_to_script}`}
-
\ No newline at end of file
+
+
+
+  **ChatGPT Plus/Pro subscribers**: You can use `LLM.subscription_login()` to authenticate with your ChatGPT account and access Codex models without consuming API credits. See the [LLM Subscriptions guide](/sdk/guides/llm-subscriptions) for details.
+
\ No newline at end of file