
[Bug]: Incorrect parsing of new bedrock model names #6644

Closed
bhushitagarwal-circle opened this issue Nov 7, 2024 · 6 comments
Labels
bug Something isn't working

Comments

@bhushitagarwal-circle

What happened?

The new model identifiers released on Bedrock have a region prefix prepended to them, for example us.anthropic.claude-3-5-sonnet-20241022-v2:0 for the new 3.5 Sonnet, us.anthropic.claude-3-5-haiku-20241022-v1:0 for the new 3.5 Haiku, and us.meta.llama3-2-90b-instruct-v1:0 for Meta Llama 3.2 90B Vision Instruct.

This breaks provider parsing in the Bedrock logic, which splits the model name on . and takes the first element as the provider, so the region prefix us gets misread as the provider.
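
For illustration, a minimal sketch of the failure mode and a region-aware fix (my own sketch, not LiteLLM's actual code; the function name and the set of region prefixes are assumptions):

```python
# Hypothetical sketch of the provider-parsing bug; not LiteLLM's implementation.
BEDROCK_REGION_PREFIXES = {"us", "eu", "apac"}  # assumed cross-region inference prefixes

def get_bedrock_provider(model: str) -> str:
    parts = model.split(".")
    # Naive parsing takes parts[0] as the provider, so
    # "us.anthropic.claude-3-5-sonnet-20241022-v2:0" yields "us".
    if parts[0] in BEDROCK_REGION_PREFIXES:
        parts = parts[1:]  # skip the region prefix before reading the provider
    return parts[0]

assert get_bedrock_provider("anthropic.claude-3-5-sonnet-20240620-v1:0") == "anthropic"
assert get_bedrock_provider("us.anthropic.claude-3-5-sonnet-20241022-v2:0") == "anthropic"
```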

Relevant log output

Error during streaming response: litellm.NotFoundError: BedrockException - Bedrock HTTPX: Unknown provider=us, model=us.meta.llama3-2-90b-instruct-v1:0
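
A minimal reproduction through the Python SDK (my own sketch, assuming AWS credentials are configured in the environment):

```python
import litellm

# On affected versions this raised:
# BedrockException - Bedrock HTTPX: Unknown provider=us, model=us.meta.llama3-2-90b-instruct-v1:0
response = litellm.completion(
    model="bedrock/us.meta.llama3-2-90b-instruct-v1:0",
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,  # the error above surfaced during a streaming response
)
for chunk in response:
    print(chunk)
```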


@bhushitagarwal-circle
Author

This was resolved in recent versions.

@stephaneminisini

I'm still seeing the issue for us.anthropic.claude-3-5-haiku-20241022-v1:0. The other models, us.anthropic.claude-3-5-sonnet-20241022-v2:0 and us.anthropic.claude-3-5-sonnet-20240620-v1:0, work fine though.

{
  "aws_region_name": "us-east-1",
  "model": "bedrock/us.anthropic.claude-3-5-haiku-20241022-v1:0",
  "cache": { "no-cache": true },
  "error": "litellm.NotFoundError: BedrockException - Bedrock HTTPX: Unknown provider=us, model=us.anthropic.claude-3-5-haiku-20241022-v1:0
Have you set 'mode' - https://docs.litellm.ai/docs/proxy/health#embedding-models
stack trace: Traceback (most recent call last):
  File \"/usr/local/lib/python3.11/site-packages/litellm/main.py\", line 2589, in completion
    response = bedrock_chat_completion.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File \"/usr/local/lib/python3.11/site-packages/litellm/llms/bedrock/chat/invoke_handler.py\", line 811, in completion
    raise BedrockError(
litellm.llms.bedrock.common_utils.BedrockError: Bedrock HTTPX: Unknown provider=us, model=us.anthropic.claude-3-5-haiku-20241022-v1:0

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File \"/usr/local/lib/python3.11/site-packages/litellm/main.py\", line 5201, in ahealth_check
    await acompletion(**model_params)
  File \"/usr/local/lib/python3.11/site-packages/litellm/utils.py\", line 1228, in wrapper_async
    raise e
  File \"/usr/local/lib/python3.11/site-packages/litellm/utils.py\", line 1084, in wrapper_async
    result = await original_function(*args, **kwargs)
"
}

@brycedrennan

brycedrennan commented Nov 18, 2024

Yes, still happening for me as well on the most recent version of litellm (1.52.10).

Actually, I confused the client version with the proxy image. Updating the image to 1.52.10 fixed the issue.

litellm-1             |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1             |   File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 755, in wrapper
litellm-1             |     result = original_function(*args, **kwargs)
litellm-1             |              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
litellm-1             |   File "/usr/local/lib/python3.11/site-packages/litellm/main.py", line 2850, in completion
litellm-1             |     raise exception_type(
litellm-1             |           ^^^^^^^^^^^^^^^
litellm-1             |   File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 8207, in exception_type
litellm-1             |     raise e
litellm-1             |   File "/usr/local/lib/python3.11/site-packages/litellm/utils.py", line 7019, in exception_type
litellm-1             |     raise NotFoundError(
litellm-1             | litellm.exceptions.NotFoundError: litellm.NotFoundError: BedrockException - Bedrock HTTPX: Unknown provider=us, model=us.anthropic.claude-3-5-haiku-20241022-v1:0
litellm-1             | Received Model Group=bedrock-claude-3-5-haiku-20241022
litellm-1             | Available Model Group Fallbacks=None
litellm-1             | 22:34:45 - LiteLLM Proxy:ERROR: _common.py:120 - Giving up chat_completion(...) after 1 tries (litellm.proxy._types.ProxyException)
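
Since the client library and the proxy image can run different versions, checking both sides helps rule out this confusion (a sketch; importlib.metadata is the standard-library way to read an installed package's version):

```python
from importlib.metadata import version

# Version of the litellm package installed in the local (client) environment;
# the proxy container may run a different, older version - compare this
# against the image tag you deploy (e.g. ghcr.io/berriai/litellm:...).
print("client litellm:", version("litellm"))
```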

@odellus

odellus commented Dec 17, 2024

I'm getting [Errno -2] Name or service not known for model: bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0

@odellus

odellus commented Dec 17, 2024

Works fine with completion in the Python SDK.
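
For reference, a direct SDK call of the kind that works here (a sketch; the model id is taken from the comment above, and AWS credentials are assumed to be configured in the environment):

```python
import litellm

# Direct (non-proxy) call through the Python SDK.
response = litellm.completion(
    model="bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```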

@ishaan-jaff
Contributor

hi @bhushitagarwal-circle - curious, do you use LiteLLM Proxy in production, or are you evaluating it?

If yes, we'd love to hop on a call to get your feedback on how we can improve. Sharing our Calendly for your convenience: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
