Multi-threading #62

Hey, @KeenanFernandes2000!

Yes, it is possible in v0.1.33 of Ollama. You need to set a couple of environment variables. Check out this link, specifically the section under Experimental concurrency features.

OLLAMA_NUM_PARALLEL: Handle multiple requests simultaneously for a single model
OLLAMA_MAX_LOADED_MODELS: Load multiple models simultaneously

Example:

```
OLLAMA_NUM_PARALLEL=4 OLLAMA_MAX_LOADED_MODELS=4
```
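
These variables are read by the server process, not the client, so they have to be in the environment of `ollama serve`. A minimal sketch for a manual launch (on a typical Linux install where Ollama runs under systemd, you would instead add `Environment=` lines via `systemctl edit ollama.service` and restart the service):

```sh
# Run the server with experimental concurrency enabled:
# up to 4 parallel requests per loaded model, and up to
# 4 models resident at once (memory permitting).
OLLAMA_NUM_PARALLEL=4 OLLAMA_MAX_LOADED_MODELS=4 ollama serve
```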

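To check that requests are actually being handled in parallel, you can fire several generate calls at the HTTP API at once. A rough sketch against the default port (the model name `llama3` is just an example; substitute one you have pulled):

```sh
# With OLLAMA_NUM_PARALLEL=4, these four requests should be
# processed concurrently instead of queuing one after another.
for i in 1 2 3 4; do
  curl -s http://localhost:11434/api/generate \
    -d '{"model": "llama3", "prompt": "Why is the sky blue?", "stream": false}' &
done
wait
```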
Answer selected by KeenanFernandes2000