How to Get OpenHands to Work with AWS Bedrock Using Docker on macOS? #6647
I’ve tested invoking the us.anthropic.claude-3-5-sonnet-20241022-v2:0 model using both boto3 and litellm to compare their behavior and output. Here’s the code and the observed responses:

```python
import os
import json

import boto3
from litellm import completion

# Set AWS region and profile to ensure correct configuration
os.environ["AWS_REGION_NAME"] = "us-east-1"
os.environ["AWS_PROFILE"] = "openhands-bedrock"

# Optional: enable DEBUG logging for more information
os.environ["LITELLM_LOG"] = "DEBUG"

# Use the inference profile ID for both examples
model_id = "us.anthropic.claude-3-5-sonnet-20241022-v2:0"

# --- boto3 example ---
print("Using boto3:")

# Initialize the Bedrock runtime client
session = boto3.Session(
    profile_name=os.environ["AWS_PROFILE"],
    region_name=os.environ["AWS_REGION_NAME"],
)
bedrock = session.client("bedrock-runtime")

# Prepare the payload for boto3
payload = {
    "modelId": model_id,
    "contentType": "application/json",
    "accept": "application/json",
    "body": json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 200,
        "top_k": 250,
        "temperature": 1,
        "top_p": 0.999,
        "messages": [
            {
                "role": "user",
                "content": [{"type": "text", "text": "hello world"}],
            }
        ],
    }),
}

try:
    response = bedrock.invoke_model(
        modelId=payload["modelId"],
        contentType=payload["contentType"],
        accept=payload["accept"],
        body=payload["body"],
    )
    result = json.loads(response["body"].read())
    print("Model response from boto3:", result)
except Exception as e:
    print(f"An error occurred with boto3: {e}")

# --- litellm example ---
print("\nUsing litellm:")
try:
    # Attempt the completion request with litellm
    response = completion(
        model=f"bedrock/{model_id}",  # prefix with 'bedrock/' for litellm
        messages=[{"role": "user", "content": "hello world"}],
        max_tokens=200,
        top_k=250,
        temperature=1,
        top_p=0.999,
    )
    print("Model response from litellm:", response)
except Exception as e:
    print(f"An error occurred with litellm: {e}")
```

Outputs:
I'm setting up OpenHands with AWS Bedrock on macOS using Docker, but I'm running into connection issues related to the Docker client and server API version. Some commands work inside the container, but the main application fails with the errors below.

Docker Run Command:

Commands that ran successfully:

```bash
docker exec -it openhands-app-bedrock /bin/bash
```

and, once inside the container, `ipython`.

Error Details:
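For what it's worth, a quick way to see which API version the daemon actually speaks is to query it with the Docker SDK for Python. This is a minimal sketch under the assumption that the failure comes from the Python client's API-version negotiation; `docker.from_env()` honors `DOCKER_HOST` and `DOCKER_API_VERSION` from the environment, so it connects the same way a containerized Python app would:

```python
import docker  # Docker SDK for Python: pip install docker

# Build a client purely from the environment
# (DOCKER_HOST, DOCKER_API_VERSION, ...).
client = docker.from_env()

# version() reports what the daemon supports; compare this range against
# the API version the failing client requests.
info = client.version()
print("Server version:     ", info.get("Version"))
print("Server API version: ", info.get("ApiVersion"))
print("Minimum API version:", info.get("MinAPIVersion"))
```

If the client cannot reach the daemon at all, note that Docker Desktop on macOS exposes a user-owned socket at `~/.docker/run/docker.sock`; for a pure version mismatch, pinning `DOCKER_API_VERSION` to a value inside the server's supported range is the usual workaround.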