
Fixing vLLM image #651

Merged
aspctu merged 5 commits into main from fixing-vllm-image on Sep 7, 2023
Conversation

@aspctu aspctu (Collaborator) commented Sep 7, 2023

We accidentally replaced the OpenAI-compatible vLLM server with a generic FastAPI server implementation that was missing the health check.
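For illustration only (not the code from this PR): the practical consequence of dropping the health check is that orchestration probes never see the pod as ready. A minimal sketch of a FastAPI health route is below; the `/health` path and the `MODEL_READY` readiness flag are assumptions, not the repository's actual implementation.

```python
# Minimal sketch (assumptions, not the PR's code): a FastAPI app exposing the
# health check that the generic server implementation was missing.
from fastapi import FastAPI, Response

app = FastAPI()
MODEL_READY = False  # hypothetical flag, flipped once the vLLM engine has loaded


@app.get("/health")
def health() -> Response:
    # Readiness/liveness probes poll this endpoint; without it the serving pod
    # is never marked healthy by the cluster.
    if MODEL_READY:
        return Response(status_code=200)
    return Response(status_code=503)
```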

Testing

The fix was tested on an in-cluster node.

TODO

  • Test this PR end-to-end on a node

@aspctu aspctu merged commit 87058b0 into main Sep 7, 2023
@aspctu aspctu deleted the fixing-vllm-image branch September 7, 2023 23:46