Use a different model to guess pip and respond #80

Merged 2 commits on Feb 16, 2024
src/rawdog/config.py (2 additions, 0 deletions)

```diff
@@ -10,6 +10,7 @@
     "llm_api_key": None,
     "llm_base_url": None,
     "llm_model": "gpt-4-turbo-preview",
+    "pip_model": None,
     "llm_custom_provider": None,
     "llm_temperature": 1.0,
     "retries": 2,
@@ -21,6 +22,7 @@
 setting_descriptions = {
     "retries": "If the script fails, retry this many times before giving up.",
     "leash": "Print the script before executing and prompt for confirmation.",
+    "pip_model": "The model to use to get package name from import name.",
 }
```
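For context on what the new setting controls: `get_python_package` asks a model to translate a Python import name into the pip package that provides it, because the two frequently differ. A few well-known mismatches (my examples, not from this PR):

```python
# Import name on the left, pip package that installs it on the right.
KNOWN_MISMATCHES = {
    "cv2": "opencv-python",
    "PIL": "Pillow",
    "sklearn": "scikit-learn",
    "yaml": "PyYAML",
}
```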
src/rawdog/llm_client.py (8 additions, 1 deletion)

```diff
@@ -46,7 +46,14 @@ def add_message(self, role: str, content: str):
 
     def get_python_package(self, import_name: str):
         base_url = self.config.get("llm_base_url")
-        model = self.config.get("llm_model")
+        model = self.config.get("pip_model")
+        llm_model = self.config.get("llm_model")
+        if model is None:
+            if "ft:" in llm_model or "rawdog" in llm_model or "abante" in llm_model:
+                model = "gpt-3.5-turbo"
+            else:
+                model = llm_model
+
         custom_llm_provider = self.config.get("llm_custom_provider")
         messages = [
```
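The selection logic reads cleanly as a standalone function. Below is a sketch of the behavior (the helper name and the example model strings are mine, not from the PR): judging by the PR title, the intent is that a fine-tuned rawdog model keeps generating scripts while a general-purpose model handles the package lookup.

```python
def resolve_pip_model(config: dict) -> str:
    """Pick the model used to map an import name to a pip package name."""
    model = config.get("pip_model")
    llm_model = config.get("llm_model")
    if model is None:
        # Heuristics for "llm_model looks fine-tuned": OpenAI fine-tune IDs
        # start with "ft:", and "rawdog"/"abante" presumably match the
        # project's own published fine-tunes (rawdog is an AbanteAI project).
        if "ft:" in llm_model or "rawdog" in llm_model or "abante" in llm_model:
            model = "gpt-3.5-turbo"
        else:
            model = llm_model
    return model

# Default config: the package lookup simply reuses llm_model.
assert resolve_pip_model({"pip_model": None, "llm_model": "gpt-4-turbo-preview"}) == "gpt-4-turbo-preview"
# Fine-tune detected: fall back to gpt-3.5-turbo for the lookup.
assert resolve_pip_model({"pip_model": None, "llm_model": "ft:gpt-3.5-turbo:abante::x"}) == "gpt-3.5-turbo"
# An explicit pip_model always wins.
assert resolve_pip_model({"pip_model": "gpt-4", "llm_model": "ft:gpt-3.5-turbo:abante::x"}) == "gpt-4"
```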
Member:
What if the user is using a different LLM provider (like ollama)? It would be kind of annoying to have to change the pip model as well. But I guess that's a very niche scenario, so it probably doesn't matter.

Member (Author):
It should only be an issue if they are using ollama with a rawdog fine-tuned model. Otherwise they'll just get the same model for both llm_model and pip_model, which is what they would want.
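In other words, the niche case has an escape hatch: `pip_model` can be pinned explicitly. A hypothetical config for the scenario above (model names are illustrative, and the `ollama/` prefix assumes LiteLLM-style routing, which this diff doesn't show):

```python
# A fine-tuned model served through ollama. Because "rawdog" appears in
# llm_model, the fallback would otherwise send package lookups to
# gpt-3.5-turbo; pinning pip_model keeps them on the local provider.
config = {
    "llm_model": "ollama/rawdog-ft",    # illustrative name
    "llm_custom_provider": "ollama",
    "pip_model": "ollama/llama2",       # explicit override
}
```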

