Ollama Error
#1804
Replies: 1 comment
-
I am having the same issue. I verified that Ollama is running by opening http://127.0.0.1:11434/ in my browser, which shows 'Ollama is running'. Using the Ollama LLM component still gives the `missing 1 required positional argument: 'base_url'` error. There are no errors when I use OpenAI. I have only one model installed (llama3).
-
I am new to Langflow and was trying to use Llama 2 through Ollama as the model, but I am getting the following error:
ValueError: Error building vertex Ollama: ChatOllamaComponent.build() missing 1 required positional argument: 'base_url'
The base URL is left at its default, http://localhost:11434/.
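As a quick sanity check outside Langflow, you can talk to Ollama's REST API directly from the standard library. The sketch below only builds the request (actually sending it requires the Ollama server to be running); the endpoint and payload shape follow Ollama's `/api/generate` API, and the helper name and default URL here are illustrative assumptions, not part of Langflow.

```python
import json
import urllib.request

# Default Ollama endpoint, as reported working in the browser above.
OLLAMA_BASE_URL = "http://localhost:11434"

def build_generate_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint.

    `base_url` is passed explicitly, mirroring the argument the
    Langflow component complains about when it is missing.
    """
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        base_url.rstrip("/") + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request(OLLAMA_BASE_URL, "llama3", "Hello")
print(req.full_url)  # the URL the component should be hitting
# To actually send it (with Ollama running):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```

If a request built this way succeeds when sent, the server side is fine and the problem is purely that the Langflow component is not receiving a `base_url` value.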