fix(api-proxy): keep container running when no API keys present #801
Conversation
The api-proxy container was exiting immediately when no API keys were configured because the HTTP servers were only started conditionally. This caused docker-compose health checks to fail and prevented dependent containers from starting.

Changed server.js to always start both HTTP servers (ports 10000 and 10001), with conditional middleware setup based on API key presence. This keeps the container running and health checks passing even when API keys are not configured.

Fixes the issue where the test-examples workflow failed with "dependency failed to start: container awf-api-proxy exited (0)".

Co-authored-by: lpcox <15877973+lpcox@users.noreply.github.com>
Pull request overview
This PR fixes the api-proxy sidecar container exiting immediately when no LLM API keys are provided by ensuring the HTTP servers always bind to ports 10000/10001 while only enabling proxy middleware when the corresponding key exists. This keeps docker-compose health checks and depends_on: condition: service_healthy working in workflows/tests that don’t use LLM APIs.
Changes:
- Always start the OpenAI listener on port 10000 and Anthropic listener on port 10001.
- Conditionally attach `http-proxy-middleware` only when `OPENAI_API_KEY`/`ANTHROPIC_API_KEY` are present (see the sketch below).
- Add logs indicating whether each proxy is configured or disabled.
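For readers skimming the diff, here is a minimal sketch of the pattern the overview describes: the listener always binds, and the proxy middleware is only mounted when a key is present. The app name `openaiApp`, the bare `target`, and the log wording are illustrative assumptions rather than the actual contents of server.js (which also routes traffic through Squid).

```js
const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const openaiApp = express();

if (OPENAI_API_KEY) {
  // Key present: mount the proxy middleware so requests reach api.openai.com.
  openaiApp.use('/', createProxyMiddleware({
    target: 'https://api.openai.com',
    changeOrigin: true,
  }));
} else {
  // No key: skip the middleware but keep the app so the port still binds.
  console.log('[API Proxy] OPENAI_API_KEY not set - proxy disabled');
}

// Always listen, so the docker-compose health check can succeed either way.
openaiApp.listen(10000, '0.0.0.0');
```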
Comments suppressed due to low confidence (1)
containers/api-proxy/server.js:111
- When ANTHROPIC_API_KEY is not set, no proxy middleware is attached, but the startup log still says "Anthropic proxy listening on port 10001". Consider updating this log message (or making it conditional) so it’s clear the server is up to satisfy health/dependency checks but the proxy functionality is disabled.
```js
anthropicApp.listen(10001, '0.0.0.0', () => {
  console.log('[API Proxy] Anthropic proxy listening on port 10001');
  if (ANTHROPIC_API_KEY) {
    console.log('[API Proxy] Routing through Squid to api.anthropic.com');
```
```js
console.log('[API Proxy] OpenAI proxy listening on port 10000');
if (OPENAI_API_KEY) {
  console.log('[API Proxy] Routing through Squid to api.openai.com');
```
When OPENAI_API_KEY is not set, no proxy middleware is attached, but the startup log still says "OpenAI proxy listening on port 10000". Consider changing this message to reflect that the server is listening but the proxy is disabled, or make the message conditional so logs don’t imply the proxy is active when it isn’t.
This issue also appears on line 108 of the same file.
Suggested change:

```diff
-console.log('[API Proxy] OpenAI proxy listening on port 10000');
-if (OPENAI_API_KEY) {
-  console.log('[API Proxy] Routing through Squid to api.openai.com');
+if (OPENAI_API_KEY) {
+  console.log('[API Proxy] OpenAI proxy listening on port 10000');
+  console.log('[API Proxy] Routing through Squid to api.openai.com');
+} else {
+  console.log('[API Proxy] OpenAI proxy server listening on port 10000 (proxy disabled - no OPENAI_API_KEY configured)');
```
The api-proxy container was exiting immediately (exit code 0) when no API keys were configured, causing docker-compose health checks to fail with "dependency failed to start: container awf-api-proxy exited (0)". This broke the test-examples workflow which doesn't use LLM APIs.
Changes
- `containers/api-proxy/server.js`: HTTP servers on ports 10000 and 10001 now always start, regardless of API key presence

Before
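A sketch of the kind of conditional startup described above; `openaiApp` and the exact log text are assumptions, not the literal pre-fix code.

```js
// Pre-fix pattern: the listener itself was gated on the key, so with no key
// configured the process had nothing keeping it alive and exited with code 0.
if (OPENAI_API_KEY) {
  openaiApp.listen(10000, '0.0.0.0', () => {
    console.log('[API Proxy] OpenAI proxy listening on port 10000');
  });
}
```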
After
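And a sketch of the post-fix shape (again illustrative, not the literal diff): the listener always starts, and only the middleware and log message depend on the key.

```js
// Post-fix pattern: always bind the port; report whether the proxy is active
// based on whether a key was configured.
openaiApp.listen(10000, '0.0.0.0', () => {
  if (OPENAI_API_KEY) {
    console.log('[API Proxy] OpenAI proxy listening on port 10000');
  } else {
    console.log('[API Proxy] OpenAI proxy listening on port 10000 (proxy disabled - no OPENAI_API_KEY configured)');
  }
});
```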
This ensures the container remains healthy for docker-compose dependencies even when API keys aren't needed (e.g., in tests that don't use LLM APIs).