Hello,
I'm trying to wire pr-agent to a local LLM running with Ollama.
I have followed the docs and set the Ollama API base both in the secrets file and through an environment variable, but neither is picked up by PR-Agent.
I think this comes from pr_agent/algo/ai_handlers/litellm_ai_handler.py, which does not take this configuration into account: it keeps the default URL, localhost:11434.
I was able to work around it using the Hugging Face configuration.
Example:
Not working:
Working:
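Since the original snippets did not survive, here is a hedged sketch of what the two setups likely looked like, based on the pr-agent documentation for local models; the model name, host, and exact section keys are assumptions, not the reporter's actual files:

```toml
# Not working (assumed): the documented Ollama settings in .secrets.toml.
# The api_base below was reportedly ignored by litellm_ai_handler.py,
# which fell back to the default localhost:11434.
[config]
model = "ollama/llama2"  # placeholder model name

[ollama]
api_base = "http://my-ollama-host:11434"  # custom host, not picked up

# Working workaround (assumed): pointing the Hugging Face section's
# api_base at the same Ollama endpoint instead.
[huggingface]
api_base = "http://my-ollama-host:11434"
```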