[Bug]: AWS Bedrock Nova error #7181
Labels: bug
Comments
Try replacing the model name with 'us.amazon.nova-pro-v1:0'.
The Amazon model (TMK) is different from Anthropic and Llama in that the modelId does not use the bare on-demand ID; Nova has to be invoked through a cross-region inference profile, which is what the 'us.' prefix denotes.
Here is the model API guidance from AWS Console:
The above was run on litellm version 1.56.4. I tried with the regular
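A minimal sketch of the suggested fix as a standalone litellm call (the region, prompt, and credential handling here are assumptions, not details from this issue):

```python
# Sketch of the suggested fix (not the reporter's code): call Nova through
# the cross-region inference-profile ID ("us." prefix) rather than the
# bare "amazon.nova-pro-v1:0" model ID.
import os
import litellm

# LiteLLM reads Bedrock credentials and region from the environment;
# the region used here is an assumption.
os.environ.setdefault("AWS_REGION_NAME", "us-east-1")

response = litellm.completion(
    model="bedrock/us.amazon.nova-pro-v1:0",
    messages=[{"role": "user", "content": "Hello from Bedrock Nova"}],
)
print(response.choices[0].message.content)
```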
What happened?
LiteLLM proxy setup with a few models.
Other models work, but Nova does not.
The error is the same with a Python litellm script as with open-webui.
Not sure why the provider is different. I added them the same way using the Add Model UI.
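The reporter's proxy config is not reproduced above; a comparable LiteLLM proxy config.yaml entry for a Bedrock Nova model might look like the following sketch, where the aliases, the second model entry, and the region are assumptions:

```yaml
model_list:
  # Hypothetical alias; the reporter's actual model names are not shown above.
  - model_name: nova-pro
    litellm_params:
      # Cross-region inference-profile ID suggested in the comments,
      # not the bare amazon.nova-pro-v1:0 ID.
      model: bedrock/us.amazon.nova-pro-v1:0
      aws_region_name: us-east-1
  # Example of one of the "other models" that already work (assumed).
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0
      aws_region_name: us-east-1
```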
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.54.1
Twitter / LinkedIn details
No response