Replies: 1 comment
-
Tabby supports connecting to a model backend through an HTTP API - see https://tabby.tabbyml.com/docs/administration/model/ for configuration examples.
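For illustration, a minimal sketch of what such a configuration might look like in `~/.tabby/config.toml`, pointing Tabby at an OpenAI-compatible server such as vLLM or Ollama. The section name and `kind` value follow the linked docs; the model name, endpoint port, and API key here are placeholders, not verified values:

```toml
# ~/.tabby/config.toml -- illustrative sketch, check the linked docs for exact fields.
# Route Tabby's chat feature to an external OpenAI-compatible endpoint
# (model_name, api_endpoint, and api_key below are placeholders).
[model.chat.http]
kind = "openai/chat"
model_name = "my-chat-model"
api_endpoint = "http://localhost:8000/v1"
api_key = ""
```

With a backend configured this way, the model server and the Tabby service run as separate processes, which is the separation the question below asks about.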
-
The quick start only provides commands that start the large models together with the service. I want to simply start the service and have it use the OpenAI-compatible interface provided by vLLM or Ollama, which is the more common approach; the model platform and the service application should be separated. Please provide such a setup, thanks a lot.