[bug] anthropic model strings in PREFERRED_MODELS not mapping to litellm providers #325
Open
adubatl wants to merge 12 commits into AgentOps-AI:main from adubatl:adubatl/claude_model_issue
+87 −10
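
As quick orientation before the diff: below is a hypothetical sketch of the failure mode named in the title, assuming PREFERRED_MODELS holds "provider/model" strings (the new test imports it from agentstack.cli.cli). The catalog and the preferred entries shown here are invented for illustration; they are not litellm's or AgentStack's real lists.

# Hypothetical illustration only: a PREFERRED_MODELS-style entry that a
# litellm-derived catalog does not recognize. All names below are made up.
from difflib import get_close_matches

litellm_catalog = {
    "anthropic/claude-3-5-sonnet-20241022",
    "anthropic/claude-3-opus-20240229",
    "openai/gpt-4o",
}
preferred = ["anthropic/claude-3-5-sonnet", "openai/gpt-4o"]  # stand-in entries

for model in preferred:
    if model not in litellm_catalog:
        hint = get_close_matches(model, litellm_catalog, n=1, cutoff=0.3)
        print(f"{model!r} does not map to a litellm model; closest match: {hint}")
# -> 'anthropic/claude-3-5-sonnet' does not map to a litellm model;
#    closest match: ['anthropic/claude-3-5-sonnet-20241022']

The test added in this PR automates the same membership check against litellm's real models_by_provider mapping.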
Commits (12)

bb9eb45  prevent model not found error after using da wizard (adubatl)
536219e  minor copy updates (adubatl)
dbb047d  Merge branch 'main' into adubatl/claude_model_issue (bboynton97)
ad7371b  undo auto-formatting of .md file (adubatl)
aeaa4b1  Merge branch 'main' into adubatl/claude_model_issue (adubatl)
06c42a9  fix the anthropic models and add a test (adubatl)
13f1b20  slightly better console messaging (adubatl)
3dc99b5  Merge branch 'main' into adubatl/claude_model_issue (adubatl)
01f6616  i wonder when I installed litellm in my venv manually 0,0 (adubatl)
11d51e5  fix my flu-induced inverted conditional and add a couple logs (adubatl)
c511bc6  Merge branch 'main' into adubatl/claude_model_issue (adubatl)
ec53519  Merge branch 'main' into adubatl/claude_model_issue (adubatl)
@@ -0,0 +1,78 @@
from importlib.util import find_spec
import pytest
import subprocess
import sys
from difflib import get_close_matches
from agentstack.cli.cli import PREFERRED_MODELS


@pytest.fixture(scope="session", autouse=True)
def install_litellm():
    """Install litellm if not already installed."""
    print("\nChecking for litellm installation...")
    spec = find_spec("litellm")
    if spec is None:
        print("litellm not found, installing...")
        subprocess.check_call([sys.executable, "-m", "pip", "install", "litellm"])
        print("litellm installation complete")


def clean_model_name(provider: str, model: str) -> str:
    """
    Clean up model name by removing duplicate provider strings.
    In litellm, groq and deepseek seem to have the provider twice in the model name.
    """
    if model.startswith(f"{provider}/"):
        return f"{provider}/{model[len(provider) + 1 :]}"
    return f"{provider}/{model}"


def find_similar_models(model: str, all_models: set, num_suggestions: int = 3) -> list[str]:
    """
    Find similar model names using string matching.
    If the test fails, this shows candidate models to replace a broken one with.
    """
    try:
        provider, model_name = model.split('/')
    except ValueError:
        return get_close_matches(model, all_models, n=num_suggestions, cutoff=0.3)

    provider_models = [m for m in all_models if m.startswith(f"{provider}/")]
    if provider_models:
        matches = get_close_matches(model, provider_models, n=num_suggestions, cutoff=0.3)
        if matches:
            return matches
    return get_close_matches(model, all_models, n=num_suggestions, cutoff=0.3)


def test_preferred_models_validity():
    """Test that all PREFERRED_MODELS are valid LiteLLM models."""
    from litellm import models_by_provider

    all_litellm_models = set()
    for provider, models in models_by_provider.items():
        for model in models:
            full_model_name = clean_model_name(provider, model)
            all_litellm_models.add(full_model_name)

    invalid_models_with_suggestions = {}
    for model in PREFERRED_MODELS:
        if model not in all_litellm_models:
            suggestions = find_similar_models(model, all_litellm_models)
            invalid_models_with_suggestions[model] = suggestions

    if invalid_models_with_suggestions:
        error_message = (
            "The following models are not in LiteLLM's supported models:\n"
            "\nFor a complete list of supported models, visit: https://docs.litellm.ai/docs/providers\n"
        )
        for model, suggestions in invalid_models_with_suggestions.items():
            error_message += f"\n- {model}"
            if suggestions:
                error_message += "\n Similar available models:"
                for suggestion in suggestions:
                    error_message += f"\n * {suggestion}"
            else:
                error_message += "\n No similar models found."

        assert False, error_message
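
A note on the clean_model_name helper above: per its docstring, some litellm providers (groq, deepseek) already bake the provider into the model name, so the test normalizes every entry to exactly one "provider/" prefix before building its lookup set. The sketch below shows the two shapes with stand-in names; it is not litellm's real catalog.

# Stand-in sample of the two shapes models_by_provider can contain
# (illustrative names only, not litellm's actual data).
models_by_provider_sample = {
    "openai": ["gpt-4o"],             # bare model name: needs "openai/" prepended
    "groq": ["groq/llama3-8b-8192"],  # provider already present: must not be doubled
}

def clean_model_name(provider: str, model: str) -> str:
    # Same normalization as the helper in the diff: exactly one "provider/" prefix.
    if model.startswith(f"{provider}/"):
        return f"{provider}/{model[len(provider) + 1:]}"
    return f"{provider}/{model}"

lookup = {
    clean_model_name(provider, model)
    for provider, models in models_by_provider_sample.items()
    for model in models
}
print(sorted(lookup))  # ['groq/llama3-8b-8192', 'openai/gpt-4o']

Every PREFERRED_MODELS entry has to appear in this "provider/model" form for the test to pass; once litellm is installed (the session fixture installs it if missing), the new file runs under plain pytest.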