fix: 503 when private gpt gets ollama service (#2104)
When running private-gpt with an external Ollama API, the ollama service returns 503 on startup because that service (a traefik proxy) may not be ready yet.

- Add a healthcheck to the ollama service that tests the connection to the external Ollama API
- Make the private-gpt-ollama service depend on the ollama service being service_healthy (see the configuration sketch below)

Co-authored-by: Koh Meng Hui <kohmh@duck.com>
meng-hui and Koh Meng Hui authored Oct 17, 2024
1 parent 5851b02 commit 940bdd4
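
For context, here is a minimal sketch of how the relevant parts of docker-compose.yaml look after this change. The service names, image, port, and healthcheck values are taken from the diff below; the remaining keys of the private-gpt-ollama service (image, environment, profiles, and so on) are omitted for brevity, and the inline comments are added here for illustration.

services:
  private-gpt-ollama:
    # ... image, environment, profiles, etc. unchanged and omitted here ...
    depends_on:
      ollama:
        condition: service_healthy   # start only once the ollama service reports healthy

  # Proxy (traefik) that routes requests to the Ollama backend
  ollama:
    image: traefik:v2.10
    healthcheck:
      # probe the Ollama API; mark the container healthy once it responds
      test: ["CMD", "sh", "-c", "wget -q --spider http://ollama:11434 || exit 1"]
      interval: 10s
      retries: 3
      start_period: 5s
      timeout: 5s
    ports:
      - "8080:8080"

With condition: service_healthy, Compose waits for the ollama container's healthcheck to pass before starting private-gpt-ollama, rather than starting it as soon as the container is created.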
Showing 1 changed file with 8 additions and 1 deletion: docker-compose.yaml
@@ -29,7 +29,8 @@ services:
       - ollama-cuda
       - ollama-api
     depends_on:
-      - ollama
+      ollama:
+        condition: service_healthy
 
   # Private-GPT service for the local mode
   # This service builds from a local Dockerfile and runs the application in local mode.
@@ -60,6 +61,12 @@ services:
   # This will route requests to the Ollama service based on the profile.
   ollama:
     image: traefik:v2.10
+    healthcheck:
+      test: ["CMD", "sh", "-c", "wget -q --spider http://ollama:11434 || exit 1"]
+      interval: 10s
+      retries: 3
+      start_period: 5s
+      timeout: 5s
     ports:
       - "8080:8080"
     command:
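A note on the healthcheck parameters above: Docker runs the wget probe inside the traefik container every 10 seconds, allows each attempt 5 seconds to complete, does not count failures during the first 5 seconds after the container starts, and marks the container unhealthy after 3 consecutive failures. Once the probe succeeds, the container is reported healthy and the dependent private-gpt-ollama service is started. To check the state manually, docker inspect --format '{{.State.Health.Status}}' <container> prints the current health status (the exact container name depends on your Compose project name).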
