
OLLAMA Api Base not considered #657

Closed
octera opened this issue Feb 13, 2024 · 2 comments · Fixed by #836

Comments

@octera

octera commented Feb 13, 2024

Hello,
I'm trying to wire pr-agent with a local LLM running on Ollama.
I have followed the docs and set the Ollama API base both in the secrets file and through an env var: neither is picked up by PR Agent.
I think this comes from pr_agent/algo/ai_handlers/litellm_ai_handler.py, which does not consider this configuration: it keeps the default URL, localhost:11434.

I have been able to work around this with the huggingface configuration.

Example:
Not working:

CONFIG.MODEL=ollama/aRandomModel 
OLLAMA.API_BASE=http://ollama-service:11434

Working:

CONFIG.MODEL=ollama/aRandomModel 
HUGGINGFACE.KEY="dummy" 
HUGGINGFACE.API_BASE=http://ollama-service:11434 
@hussam789
Collaborator

/help

@gregoryboue
Contributor

Hi,

@octera your workaround worked until v0.12.
It no longer works since this commit, which introduced the following condition:

if get_settings().get("HUGGINGFACE.API_BASE", None) and 'huggingface' in get_settings().config.model

So if 'huggingface' does not appear in the model name, the api_base configuration is never applied.
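To make the problem concrete, here is a minimal sketch of the api_base selection logic described above, reduced to plain dicts. The function name `resolve_api_base` and the dict-based settings are illustrative stand-ins, not the actual pr_agent API (the real handler reads from `get_settings()` in litellm_ai_handler.py); the second branch shows one possible fix, honoring OLLAMA.API_BASE for ollama/* models.

```python
# Hypothetical sketch of the api_base selection in litellm_ai_handler.py.
# `settings` stands in for pr_agent's get_settings(); keys mirror the
# secrets-file sections (HUGGINGFACE.API_BASE, OLLAMA.API_BASE).

def resolve_api_base(settings: dict, model: str):
    """Return the api_base to use for `model`, or None if not configured."""
    # Behavior reported in this issue: HUGGINGFACE.API_BASE is only
    # applied when 'huggingface' appears in the model name, so an
    # ollama/* model never picks it up.
    hf_base = settings.get("HUGGINGFACE.API_BASE")
    if hf_base and "huggingface" in model:
        return hf_base
    # Possible fix (assumption, not the merged #836 patch): also honor
    # OLLAMA.API_BASE when the model uses the ollama provider prefix.
    ollama_base = settings.get("OLLAMA.API_BASE")
    if ollama_base and model.startswith("ollama/"):
        return ollama_base
    return None

# With the extra branch, the "Not working" config above resolves:
print(resolve_api_base(
    {"OLLAMA.API_BASE": "http://ollama-service:11434"},
    "ollama/aRandomModel",
))  # http://ollama-service:11434
```

With only the first branch, the same call returns None, which is why the huggingface workaround (a dummy key plus HUGGINGFACE.API_BASE) was needed before the guard on 'huggingface' in the model name was added.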
