Support for AWS Bedrock #118
PR welcome to add AWS Bedrock. I think from a brief search that Bedrock is like VertexAI - you use a custom API, but the format of the response matches the foundation model used, so actually using Bedrock with (say) Anthropic is mostly about customising the URL and auth, then using the (as yet unimplemented, see #63) Anthropic logic.
@samuelcolvin - a couple of points to add... The Anthropic SDK currently supports Amazon Bedrock through the InvokeModel API with an HTTPX client and boto3 for auth. Amazon Bedrock also has a newer Converse API that provides a consistent interface for all the models it supports. This includes the new Amazon Nova models as well as models from Meta, Mistral and Cohere; the full list is here. Maybe it makes sense to add AnthropicBedrock support in the Anthropic PR (since I see you're using the anthropic Python SDK) and then handle the broader Bedrock support separately with the Converse API?
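For reference, this is roughly what the SDK-level Bedrock support looks like today - a minimal sketch only, with the model ID and region as illustrative placeholders rather than anything from this thread:

```python
from anthropic import AnthropicBedrock

# Credentials are resolved the same way boto3 resolves them
# (environment variables, shared config file, or an attached IAM role).
client = AnthropicBedrock(aws_region="us-east-1")

message = client.messages.create(
    model="anthropic.claude-3-5-sonnet-20241022-v2:0",  # placeholder Bedrock model ID
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello from Bedrock"}],
)
print(message.content[0].text)
```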
Hi, I can help with this issue.
Any help needed? I can work on it. @PabloSimonSainz or @samuelcolvin
Following up on what @stephenhibbert said: I agree that it would be nice to have built-in support for any model on Bedrock, but passing an Anthropic Bedrock client is still a valid workaround for those who are using Anthropic models :) Although it would be nice to have the correct type hints to take both clients into account.
Spot on @dmenini. As a workaround we can install the anthropic SDK with its Bedrock support (anthropic[bedrock]) and pass an AsyncAnthropicBedrock client to the Anthropic model - a rough sketch is below.
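A minimal sketch of that workaround, not verified against a specific pydantic-ai release - it assumes AnthropicModel accepts an anthropic_client argument (mirroring the openai_client parameter used with OpenAIModel later in this thread), and the model ID and region are illustrative:

```python
from anthropic import AsyncAnthropicBedrock

from pydantic_ai import Agent
from pydantic_ai.models.anthropic import AnthropicModel

# AsyncAnthropicBedrock resolves AWS credentials the same way boto3 does
# (environment variables, shared config, or an attached IAM role).
client = AsyncAnthropicBedrock(aws_region='us-east-1')

# Bedrock model IDs differ from Anthropic API model names - this one is illustrative.
model = AnthropicModel('anthropic.claude-3-5-sonnet-20241022-v2:0', anthropic_client=client)
agent = Agent(model)

result = agent.run_sync('Hello from Bedrock!')
print(result.data)  # `.data` in early pydantic-ai versions
```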
But it would be nice to have 1) documentation for this in the Anthropic section, 2) correct type hints for Anthropic Bedrock models, and 3) full Bedrock support for any model using the Converse API. Happy to help work on/review/test this too. pydantic/logfire#701 should also mean the logfire integration will work out of the box when merged.
Hey, |
Hey guys !!! |
We will support Claude via Bedrock, as it should just be a matter of setting the URL and configuring auth. Not sure about other models. PR welcome to try to implement it, otherwise we'll try to get to it soon.
This repo maps the OpenAI API to the Bedrock API. Maybe useful as another starting point: https://github.com/aws-samples/bedrock-access-gateway/tree/main
Using Anthropic Bedrock does not work with Llama 3.3 on AWS Bedrock.
To call models through AWS Bedrock, some models can be invoked with invoke_model and others with converse (for example, Anthropic's Sonnet can only be called with converse). As a note, another small consideration is the availability of the model in the region: models-regions
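For reference, a minimal sketch of calling the Converse API directly with boto3 - the model ID and region are placeholders, not taken from this thread:

```python
import boto3

# Converse gives the same request/response shape regardless of the underlying model.
bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-5-sonnet-20241022-v2:0",  # placeholder model ID
    messages=[
        {"role": "user", "content": [{"text": "Hello from Bedrock"}]},
    ],
    inferenceConfig={"maxTokens": 256},
)
print(response["output"]["message"]["content"][0]["text"])
```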
Regarding this issue, would there be any problem if it were addressed directly with the Python AWS SDK (boto3), with just the right dependencies?
That is incorrect. In fact, at the time of writing, the anthropic Python SDK exclusively uses the InvokeModel API. At the moment there is a valid workaround for using Anthropic models on Bedrock in my comment above. The bigger unlock is implementing wider support for Bedrock models (Llama, Nova, Anthropic, Mistral etc.), which I would suggest be implemented using the Converse API.
Totally agree with this statement.
I had that problem recently on a project with AWS Bedrock and had to use the Converse API instead of the InvokeModel API for Anthropic models.
@PabloSimonSainz It's hard to tell exactly what's wrong with your code without seeing the source, but that error message suggests you're using the old-style Anthropic Text Completions format instead of the new Messages format. If you use the Messages API correctly it should work - see https://docs.anthropic.com/en/api/migrating-from-text-completions-to-messages
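To illustrate the difference, here is a rough sketch of the Messages-format request body that Bedrock's InvokeModel expects for Claude models - the model ID and region are placeholders:

```python
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Messages-format body; anthropic_version is required by Bedrock for Claude models.
body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "Hello from Bedrock"}]},
    ],
}

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-5-sonnet-20241022-v2:0",  # placeholder model ID
    body=json.dumps(body),
)
result = json.loads(response["body"].read())
print(result["content"][0]["text"])
```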
Found this project https://github.com/aws-samples/bedrock-access-gateway, which uses AWS to provide an OpenAI-compatible gateway in front of Bedrock:

```python
from openai import AsyncOpenAI  # pydantic-ai's OpenAIModel expects the async client

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

# Point the client at the bedrock-access-gateway endpoint and its API key.
client = AsyncOpenAI(
    api_key="...",
    base_url="...",
)

model = OpenAIModel('us.amazon.nova-lite-v1:0', openai_client=client)
agent = Agent(model)
```

A few minor issues still, and I have a fork of my own: https://github.com/Wh1isper/bedway
It would be nice to have support for Bedrock so that applications that use AWS can integrate more seamlessly.