
Support for AWS Bedrock #118

Open
abtawfik opened this issue Dec 2, 2024 · 18 comments
Labels
new models Support for new model(s)

Comments

@abtawfik

abtawfik commented Dec 2, 2024

It would be nice to have support for Bedrock so that applications that use AWS can integrate more seamlessly.

@samuelcolvin
Member

PR welcome to add AWS Bedrock.

I think, from a brief search, that Bedrock is like VertexAI: you use a custom API, but the format of the response matches the foundation model used, so using Bedrock with (say) Anthropic is mostly a matter of customising the URL and auth, then reusing the (as yet unimplemented, see #63) Anthropic logic.

That's how the VertexAIModel works - it does its own auth and sets the URL, then just returns an instance of GeminiAgentModel.

@sydney-runkle sydney-runkle added the new models Support for new model(s) label Dec 5, 2024
@stephenhibbert

@samuelcolvin - a couple of points to add...

The Anthropic SDK currently supports Amazon Bedrock through the InvokeModel API, using an HTTPX client and boto3 for auth.

Amazon Bedrock also has a newer Converse API that provides a consistent interface for all the models it supports. This includes the new Amazon Nova models as well as models from Meta, Mistral and Cohere; the full list is here.

Maybe it makes sense to add AnthropicBedrock support in the Anthropic PR (since I see you're using the anthropic Python SDK) and then handle broader Bedrock support separately with the Converse API?
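For context, a Converse request has the same shape regardless of the underlying model. A minimal sketch with boto3 (the model ID and region below are illustrative, and the call under the main guard needs AWS credentials; only the request-building helper runs without them):

```python
def build_converse_request(model_id: str, prompt: str, max_tokens: int = 256) -> dict:
    # Keyword arguments for bedrock_runtime.converse(); the same shape
    # works for Nova, Claude, Llama, Mistral, etc.
    return {
        "modelId": model_id,
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }


if __name__ == "__main__":
    import boto3  # needs AWS credentials configured to actually call Bedrock

    client = boto3.client("bedrock-runtime", region_name="us-east-1")
    response = client.converse(
        **build_converse_request("us.amazon.nova-lite-v1:0", "Name the windy city.")
    )
    print(response["output"]["message"]["content"][0]["text"])
```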

@PabloSimonSainz

Hi, I can help with this issue.

@YanSte
Contributor

YanSte commented Dec 16, 2024

Any help needed? I can work on it. @PabloSimonSainz or @samuelcolvin

@dmenini

dmenini commented Dec 20, 2024

Following up on what @stephenhibbert said, the AnthropicModel class currently accepts the anthropic_client param with type AsyncAnthropic. However, the current logic also works with the AsyncAnthropicBedrock client from anthropic.

I agree that it would be nice to have built-in support for any model on Bedrock, but this is still a valid workaround for those who are using Anthropic models :) It would also be nice for the type hint to account for both clients.

@stephenhibbert

Spot on @dmenini. As a workaround, we can install boto3 and pass an instance of AsyncAnthropicBedrock to AnthropicModel like this:

from anthropic import AsyncAnthropicBedrock
from pydantic_ai.models.anthropic import AnthropicModel
from pydantic_ai import Agent
from pydantic import BaseModel

anthropic_bedrock_client = AsyncAnthropicBedrock(aws_region='us-east-1')

model = AnthropicModel(
    model_name='anthropic.claude-3-haiku-20240307-v1:0',
    anthropic_client=anthropic_bedrock_client
)

class MyModel(BaseModel):
    city: str
    country: str

print(f'Using model: {model}')
agent = Agent(model, result_type=MyModel)

if __name__ == '__main__':
    result = agent.run_sync('The windy city in the US of A.')
    print(result.data)
    print(result.usage())


Output:

Using model: AnthropicModel(model_name='anthropic.claude-3-haiku-20240307-v1:0')
city='Chicago' country='United States'
Usage(requests=1, request_tokens=459, response_tokens=56, total_tokens=515, details=None)

But it would be nice to have: 1) documentation for this in the Anthropic section, 2) correct type hints for Anthropic Bedrock models, and 3) full Bedrock support for any model using the Converse API. Happy to help work on/review/test this too.

This pydantic/logfire#701 should also mean the logfire integration will work out of the box when merged.

@PierreColombo

Hey,
I'm still interested in pydantic/logfire#701 being merged :) it will remove some friction in adapting my current code!

@Shekswess

Hey guys!
@samuelcolvin sorry for being late with this comment, do you plan to add Bedrock support?
I'm extremely excited to try PydanticAI with Bedrock models (mostly the Claude models and the new AWS Nova models).
Can I somehow help and contribute?

@samuelcolvin
Member

We will support Claude via Bedrock, as it should just be a matter of setting the URL and configuring auth.

Not sure about other models.

PR welcome to try to implement it; otherwise we'll try to get to it soon.

@wolfgangcolsman

wolfgangcolsman commented Jan 4, 2025

This repo maps the OpenAI API to the Bedrock API; it may be useful as another starting point.

https://github.com/aws-samples/bedrock-access-gateway/tree/main

@wolfgangcolsman

Using Anthropic Bedrock does not work with Llama 3.3 on AWS Bedrock.

anthropic.BadRequestError: Error code: 400 - {'message': 'Malformed input request: #: required key [prompt] not found#: extraneous key [system] is not permitted#: extraneous key [max_tokens] is not permitted#: extraneous key [messages] is not permitted#: extraneous key [tool_choice] is not permitted#: extraneous key [anthropic_version] is not permitted#: extraneous key [tools] is not permitted, please reformat your input and try again.'}

@PabloSimonSainz

Using Anthropic Bedrock does not work with llama 3.3 on AWS Bedrock.

anthropic.BadRequestError: Error code: 400 - {'message': 'Malformed input request: #: required key [prompt] not found#: extraneous key [system] is not permitted#: extraneous key [max_tokens] is not permitted#: extraneous key [messages] is not permitted#: extraneous key [tool_choice] is not permitted#: extraneous key [anthropic_version] is not permitted#: extraneous key [tools] is not permitted, please reformat your input and try again.'}

To call models through AWS Bedrock, some models can be invoked through invoke_model and others with converse (for example, Anthropic Sonnet can only be called with converse).

As a note, another small consideration is the availability of the model in the region: models-regions
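On the region point, one way to check what a region actually offers is the bedrock control-plane client in boto3. A sketch (the region is illustrative; the live call under the main guard needs AWS credentials, the parsing helper does not):

```python
def extract_model_ids(response: dict) -> list[str]:
    # Pull the model IDs out of a list_foundation_models() response.
    return [summary["modelId"] for summary in response.get("modelSummaries", [])]


if __name__ == "__main__":
    import boto3  # needs AWS credentials configured

    # Note: the "bedrock" (control-plane) client, not "bedrock-runtime",
    # exposes the per-region model catalogue.
    client = boto3.client("bedrock", region_name="us-east-1")
    print(extract_model_ids(client.list_foundation_models()))
```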

@PabloSimonSainz

Regarding this issue, would there be any problem with addressing it directly via the AWS Python SDK (boto3), with just the necessary dependencies?

@stephenhibbert

To call models through aws bedrock, some models can be invoked through invoke_model and others with converse (for example anthropic/sonnet can only be called with converse).

That is incorrect. In fact, at the time of writing, the anthropic Python SDK exclusively uses the invoke API, and this works for the latest models: https://github.com/anthropics/anthropic-sdk-python/blob/93cbbbde964e244f02bf1bd2b579c5fabce4e267/src/anthropic/lib/bedrock/_client.py#L56. As well as the invoke API, there is also the newer converse API, which provides a consistent interface and is supported for some of the newer models: https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html

At the moment there is a valid workaround for using Anthropic models on Bedrock in my comment above. The bigger unlock is implementing wider support for Bedrock models (Llama, Nova, Anthropic, Mistral, etc.), which I would suggest implementing via the converse API for simplicity and ease of long-term support in PydanticAI. The popular new models will very likely all be supported by the converse API going forwards.

@abtawfik
Author

abtawfik commented Jan 6, 2025

The bigger unlock is implementing wider support for Bedrock models (Llama, Nova, Anthropic, Mistral etc) which I would suggest be implemented by using the converse API for simplicity and ease of long term support in PydanticAI. The popular new models will very likely all be supported with the converse API moving forwards.

Totally agree with this statement. The converse API is going to be the standard for boto3 going forward.

@PabloSimonSainz

To call models through aws bedrock, some models can be invoked through invoke_model and others with converse (for example anthropic/sonnet can only be called with converse).

That is incorrect. In fact, at the time of writing, the anthropic python SDK currently exclusively uses the invoke API and this works for the latest models: https://github.com/anthropics/anthropic-sdk-python/blob/93cbbbde964e244f02bf1bd2b579c5fabce4e267/src/anthropic/lib/bedrock/_client.py#L56. As well as the invoke API, there is also the newer converse API which provides a consistent interface and is supported for some of the newer models: https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html

At the moment there is a valid workaround for using Anthropic models on Bedrock in my comment above. The bigger unlock is implementing wider support for Bedrock models (Llama, Nova, Anthropic, Mistral etc) which I would suggest be implemented by using the converse API for simplicity and ease of long term support in PydanticAI. The popular new models will very likely all be supported with the converse API moving forwards.

I had that problem recently on a project with AWS Bedrock and had to use the converse API instead of the invoke API for anthropic models.

[screenshot: Bedrock error message]

@stephenhibbert

I had that problem recently on a project with AWS Bedrock and had to use the converse API instead of the invoke API for anthropic models.

[screenshot: Bedrock error message]

@PabloSimonSainz It's hard to tell exactly what's wrong with your code without seeing the source, but that error message suggests you're using the old-style Anthropic Text Completions format instead of the new Messages format. If you use the Messages API correctly with the invoke API then you can use the Haiku model, and it will also work with the converse API. Hope that helps!

https://docs.anthropic.com/en/api/migrating-from-text-completions-to-messages
https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-anthropic-claude-messages.html#claude-messages-supported-models
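To make the distinction concrete, here is a sketch of the two request-body shapes (field names follow the Anthropic and AWS docs linked above; these helpers only build the JSON body you would pass to invoke_model, nothing is sent to Bedrock):

```python
import json


def legacy_text_completions_body(prompt: str, max_tokens: int = 256) -> dict:
    # Old Text Completions format: a single flat "prompt" string.
    return {
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",
        "max_tokens_to_sample": max_tokens,
    }


def messages_body(prompt: str, max_tokens: int = 256) -> dict:
    # New Messages format, required for Claude 3 models via the invoke API.
    return {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }


# Serialized body for bedrock_runtime.invoke_model(modelId=..., body=...)
body = json.dumps(messages_body("The windy city in the US of A."))
```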

@Wh1isper

Wh1isper commented Jan 11, 2025

Found this project, https://github.com/aws-samples/bedrock-access-gateway, which uses the AWS converse API to proxy the Bedrock API as an OpenAI-compatible API.

from openai import OpenAI
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel

client = OpenAI(
    api_key="...",
    base_url="...",
)

model = OpenAIModel('us.amazon.nova-lite-v1:0', openai_client=client)
agent = Agent(model)

There are still a few minor issues, and I have a fork of my own: https://github.com/Wh1isper/bedway
