fix(api-proxy): keep container running when no API keys present#801

Merged
lpcox merged 2 commits into claude/enable-api-proxy-by-default from claude/fix-github-actions-workflow-again
Feb 13, 2026
Conversation


Claude AI (Contributor) commented Feb 13, 2026

The api-proxy container was exiting immediately (exit code 0) when no API keys were configured, causing docker-compose health checks to fail with "dependency failed to start: container awf-api-proxy exited (0)". This broke the test-examples workflow which doesn't use LLM APIs.

Changes

  • Modified containers/api-proxy/server.js: HTTP servers on ports 10000 and 10001 now always start, regardless of API key presence
  • Conditional middleware: Proxy middleware is only configured when corresponding API keys exist
  • Informative logging: Added console messages indicating whether each proxy is enabled or disabled

Before

// Server only started if API key present
if (OPENAI_API_KEY) {
  app.use(createProxyMiddleware({...}));
  app.listen(10000, ...);  // Container exits if key missing
}

After

// Middleware conditionally configured
if (OPENAI_API_KEY) {
  app.use(createProxyMiddleware({...}));
} else {
  console.log('[API Proxy] OpenAI API key not configured - proxy disabled');
}
// Server always starts to keep container alive
app.listen(10000, '0.0.0.0', ...);

This ensures the container remains healthy for docker-compose dependencies even when API keys aren't needed (e.g., in tests that don't use LLM APIs).

The api-proxy container was exiting immediately when no API keys were
configured because the HTTP servers were only started conditionally.
This caused docker-compose health checks to fail and prevented
containers from starting.

Changed server.js to always start both HTTP servers (ports 10000 and
10001), with conditional middleware setup based on API key presence.
This keeps the container running and health checks passing even when
API keys are not configured.

Fixes issue where test-examples workflow failed with "dependency
failed to start: container awf-api-proxy exited (0)"

Co-authored-by: lpcox <15877973+lpcox@users.noreply.github.com>
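For context, the docker-compose arrangement this fix targets looks roughly like the following (a hedged sketch: the `test-runner` service name and the healthcheck command are illustrative guesses, not copied from the repository's actual compose file; only `awf-api-proxy` and the build path appear in the PR itself):

```yaml
services:
  api-proxy:
    container_name: awf-api-proxy
    build: containers/api-proxy
    # Passes as soon as anything answers on port 10000, which the fix
    # guarantees even when no API keys are configured.
    healthcheck:
      test: ["CMD", "curl", "-s", "http://localhost:10000/"]
      interval: 5s
      timeout: 3s
      retries: 5

  test-runner:
    # Before the fix this dependency failed with:
    # "dependency failed to start: container awf-api-proxy exited (0)"
    depends_on:
      api-proxy:
        condition: service_healthy
```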
Claude AI changed the title from "[WIP] Fix failing GitHub Actions workflow Test Examples" to "fix(api-proxy): keep container running when no API keys present" on Feb 13, 2026
Claude AI requested a review from lpcox on February 13, 2026 04:26
lpcox marked this pull request as ready for review on February 13, 2026 04:27
Copilot AI review requested due to automatic review settings on February 13, 2026 04:27
lpcox merged commit caccd77 into claude/enable-api-proxy-by-default on Feb 13, 2026
2 checks passed
lpcox deleted the claude/fix-github-actions-workflow-again branch on February 13, 2026 04:28
Copilot AI left a comment

Pull request overview

This PR fixes the api-proxy sidecar container exiting immediately when no LLM API keys are provided by ensuring the HTTP servers always bind to ports 10000/10001 while only enabling proxy middleware when the corresponding key exists. This keeps docker-compose health checks and depends_on: condition: service_healthy working in workflows/tests that don’t use LLM APIs.

Changes:

  • Always start the OpenAI listener on port 10000 and Anthropic listener on port 10001.
  • Conditionally attach http-proxy-middleware only when OPENAI_API_KEY / ANTHROPIC_API_KEY are present.
  • Add logs indicating whether each proxy is configured or disabled.
Comments suppressed due to low confidence (1)

containers/api-proxy/server.js:111

  • When ANTHROPIC_API_KEY is not set, no proxy middleware is attached, but the startup log still says "Anthropic proxy listening on port 10001". Consider updating this log message (or making it conditional) so it’s clear the server is up to satisfy health/dependency checks but the proxy functionality is disabled.
    anthropicApp.listen(10001, '0.0.0.0', () => {
      console.log('[API Proxy] Anthropic proxy listening on port 10001');
      if (ANTHROPIC_API_KEY) {
        console.log('[API Proxy] Routing through Squid to api.anthropic.com');

Comment on lines +73 to 75:

    console.log('[API Proxy] OpenAI proxy listening on port 10000');
    if (OPENAI_API_KEY) {
      console.log('[API Proxy] Routing through Squid to api.openai.com');
Copilot AI commented Feb 13, 2026

When OPENAI_API_KEY is not set, no proxy middleware is attached, but the startup log still says "OpenAI proxy listening on port 10000". Consider changing this message to reflect that the server is listening but the proxy is disabled, or make the message conditional so logs don’t imply the proxy is active when it isn’t.

This issue also appears on line 108 of the same file.

Suggested change

Before:

    console.log('[API Proxy] OpenAI proxy listening on port 10000');
    if (OPENAI_API_KEY) {
      console.log('[API Proxy] Routing through Squid to api.openai.com');

After:

    if (OPENAI_API_KEY) {
      console.log('[API Proxy] OpenAI proxy listening on port 10000');
      console.log('[API Proxy] Routing through Squid to api.openai.com');
    } else {
      console.log('[API Proxy] OpenAI proxy server listening on port 10000 (proxy disabled - no OPENAI_API_KEY configured)');

