Refactor API proxy to use supportsLLMGateway engine flag #15557
```diff
@@ -83,7 +83,7 @@ func collectDockerImages(tools map[string]any, workflowData *WorkflowData, actio
 	}

 	// Collect AWF (firewall) container images when firewall is enabled
-	// AWF uses three containers: squid (proxy), agent, and api-proxy (for Claude/Codex)
+	// AWF uses three containers: squid (proxy), agent, and api-proxy (for engines with LLM gateway support)
 	if isFirewallEnabled(workflowData) {
 		// Get the firewall version for image tags
 		firewallConfig := getFirewallConfig(workflowData)
@@ -105,15 +105,21 @@ func collectDockerImages(tools map[string]any, workflowData *WorkflowData, actio
 		dockerLog.Printf("Added AWF agent container: %s", agentImage)
 	}

-	// Add api-proxy sidecar container for engines that use --enable-api-proxy
+	// Add api-proxy sidecar container for engines that support LLM gateway
 	// The api-proxy holds LLM API keys securely and proxies requests through Squid:
 	// - Port 10000: OpenAI API proxy (for Codex)
 	// - Port 10001: Anthropic API proxy (for Claude)
```
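The port layout described in the diff comments could be sketched as a simple lookup (a minimal illustration; the variable name and descriptions are assumptions, not the repository's actual code):

```go
package main

import "fmt"

// apiProxyPorts maps the api-proxy sidecar's listener ports to the upstream
// LLM APIs named in the diff comments above; names here are illustrative only.
var apiProxyPorts = map[int]string{
	10000: "OpenAI API (Codex)",
	10001: "Anthropic API (Claude)",
}

func main() {
	for _, port := range []int{10000, 10001} {
		fmt.Printf("port %d -> %s\n", port, apiProxyPorts[port])
	}
}
```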
|
Comment on lines +110 to 111:

```suggestion
	// - Port 10000: OpenAI API proxy (reserved for future use; Codex does not currently use the LLM gateway)
	// - Port 10001: Anthropic API proxy (for Claude; currently the only engine using the LLM gateway)
```
The conditional check
if e.SupportsLLMGateway()will always be false for Codex sincesupportsLLMGatewayis set tofalseat line 44. This means the--enable-api-proxyflag will never be added for Codex. While this appears intentional based on the configuration change, this entire conditional block (lines 250-255) is now unreachable dead code for the Codex engine and could be removed for clarity. Alternatively, if Codex is intended to support LLM gateway in the future, the configuration at line 44 should be changed totrue.This issue also appears on line 44 of the same file.