Your current environment
Environment details are not needed for this question.
How would you like to use vllm
I want to run inference through Open WebUI (or a similar frontend) using vLLM as the backend instead of Ollama.
I have already launched the vLLM OpenAI-compatible API server and Open WebUI, but the two are not working together.
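To narrow down where the connection fails, here is a minimal sketch of how the vLLM endpoint could be checked directly with the `openai` Python client. The base URL, API key, and model name below are placeholders/assumptions (vLLM's defaults when launched without extra flags), not my exact setup:

```python
# Minimal sketch: verify the vLLM OpenAI-compatible server responds on its own.
# Assumes vLLM is serving on the default http://localhost:8000 and that the
# model name below is replaced with whatever model was actually launched.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's OpenAI-compatible endpoint
    api_key="EMPTY",                      # placeholder; vLLM only checks this if --api-key was set
)

# List the models the server reports; a frontend like Open WebUI would use the
# same endpoint to discover available models, so this is a quick sanity check.
print([m.id for m in client.models.list()])

# Send one chat completion request through the same API the frontend would use.
response = client.chat.completions.create(
    model="meta-llama/Llama-3-8b-Instruct",  # assumption: replace with the served model
    messages=[{"role": "user", "content": "Hello from a connectivity test."}],
)
print(response.choices[0].message.content)
```

If this works, my understanding is that Open WebUI would then need the same `http://<host>:8000/v1` URL configured as an OpenAI-compatible connection, so the remaining problem would be on the frontend configuration side rather than in vLLM itself.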
Before submitting a new issue...