Issue: how to configure Amazon Bedrock #10182
🤖 Hello, Thank you for your question. Here's a step-by-step guide on how to configure Amazon Bedrock with the LangChain Python framework.

First, install the AWS SDK:

pip install boto3

Then instantiate the Bedrock LLM and call it with a prompt. Note that Bedrock is a completion-style LLM, so it takes a plain string rather than a list of chat messages:

from langchain.llms.bedrock import Bedrock

llm = Bedrock(model_id="anthropic.claude-v2", model_kwargs={"temperature": 0.1})
llm("Translate this sentence from English to French. I love programming.")

LangChain also provides BedrockEmbeddings for generating text embeddings through Bedrock:

from langchain.embeddings.bedrock import BedrockEmbeddings

be = BedrockEmbeddings(
    credentials_profile_name="default",
    region_name="us-east-1",
    model_id="amazon.titan-e1t-medium"
)
embeddings = be.embed_documents(["This is a test document."])

Please let me know if you have any other questions or need further clarification.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
Langchain 0.0.281 (released on 9/4/23) just added additional Bedrock support, namely BedrockChat. I'm not sure why you are attempting to use 0.0.249, but I would recommend trying the latest version. Please note that streaming is not yet supported, and it is possible that API responses may be truncated as a result. This is just an observation and I could be wrong about this, though.
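For reference, the BedrockChat usage mentioned above would look roughly like the sketch below. This is a hedged example, not a definitive recipe: it assumes langchain >= 0.0.281 and boto3 are installed and that AWS credentials with Bedrock access are configured, so the import is guarded and the live call is skipped when langchain is unavailable.

```python
# Hypothetical sketch of BedrockChat (added in langchain 0.0.281).
# Requires: pip install "langchain>=0.0.281" boto3, plus configured AWS
# credentials with Bedrock permissions. The import is guarded so this
# snippet degrades gracefully when langchain is not installed.
try:
    from langchain.chat_models import BedrockChat
    from langchain.schema import HumanMessage

    chat = BedrockChat(
        model_id="anthropic.claude-v2",
        model_kwargs={"temperature": 0.1},
    )
    reply = chat([HumanMessage(content="Translate to French: I love programming.")])
    print(reply.content)
except ImportError:
    print("langchain is not installed; skipping the live Bedrock call.")

ran = True  # marker that the snippet completed either branch
```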
Hi, @shenshaoyong, I'm helping the LangChain team manage their backlog and am marking this issue as stale. From what I understand, I provided a detailed guide on configuring Amazon Bedrock with the LangChain Python framework, including the installation of the boto3 library and its usage in the LangChain framework. Additionally, ctodd mentioned that Langchain 0.0.281 added additional Bedrock support, specifically BedrockChat, and recommended using the latest version. Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, please let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you for your understanding and cooperation.
Issue you'd like to raise.
how to configure Amazon Bedrock step by step:
Answers:
#1. Open a terminal and install the Bedrock-specific preview builds of the boto3 SDK, plus langchain
curl -sS https://d2eo22ngex1n9g.cloudfront.net/Documentation/SDK/bedrock-python-sdk.zip > sdk.zip
sudo yum install unzip -y
unzip sdk.zip -d sdk
pip install --no-build-isolation --force-reinstall ./sdk/awscli-*-py3-none-any.whl ./sdk/boto3-*-py3-none-any.whl ./sdk/botocore-*-py3-none-any.whl
pip install --quiet langchain==0.0.249
#pip install 'jupyter-ai>=1.0,<2.0' # If you use JupyterLab 3
#pip install jupyter-ai # If you use JupyterLab 4
#2. Change the default max_tokens_to_sample token count to 2048
vi ~/anaconda3/lib/python3.11/site-packages/langchain/llms/bedrock.py
change this line: input_body["max_tokens_to_sample"] = 2048
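To see where that patched value ends up: for Anthropic Claude models, LangChain serializes the model kwargs into the JSON body sent to Bedrock's invoke_model API, and max_tokens_to_sample is one of those fields. A minimal standard-library sketch of that request body (the prompt text here is illustrative, not LangChain's exact output):

```python
import json

# Sketch of the JSON body Bedrock expects for Anthropic Claude models;
# "max_tokens_to_sample" is the field the patch above raises to 2048.
input_body = {
    "prompt": "\n\nHuman: Write something about Amazon\n\nAssistant:",
    "temperature": 0.1,
}
input_body["max_tokens_to_sample"] = 2048  # the patched default

body = json.dumps(input_body)
print(body)
```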
#3. Configure the AWS default profile; make sure the access key / secret key (AK/SK) has sufficient permissions (BedrockFullAccess)
aws configure
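For context, `aws configure` persists the profile to ~/.aws/credentials and ~/.aws/config, which is where boto3 and LangChain pick the keys up from. A minimal sketch of the resulting file format, parsed with Python's standard configparser (the key values below are placeholders, not real credentials):

```python
import configparser

# What `aws configure` writes for the default profile (placeholder values).
credentials_text = """\
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
"""
config_text = """\
[default]
region = us-east-1
output = json
"""

creds = configparser.ConfigParser()
creds.read_string(credentials_text)
print(creds["default"]["aws_access_key_id"])

cfg = configparser.ConfigParser()
cfg.read_string(config_text)
print(cfg["default"]["region"])
```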
#4. Run %%ai in a *.ipynb file on an EC2 instance or local machine [can also run in VS Code] instead of a SageMaker notebook instance / SageMaker Studio
%load_ext jupyter_ai
%%ai bedrock:anthropic.claude-v2
Write something about Amazon