A ready-to-run example is available here!
Use AWS Bedrock to access Claude and other foundation models through your AWS account.
Prerequisites
AWS Bedrock requires the boto3 library:
pip install openhands-sdk "boto3>=1.28.57"
boto3 is used internally by LiteLLM - you don’t need to import it in your code.
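If you want to confirm that the installed boto3 meets the minimum version, a quick illustrative check:

# Print the installed boto3 version; it should be >= 1.28.57.
from importlib.metadata import version

print("boto3 version:", version("boto3"))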
Authentication Options
Configure AWS credentials using environment variables:
| Method | Environment Variables |
|---|---|
| Access Keys | AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY |
| Session Token | AWS_SESSION_TOKEN (in addition to access keys) |
| AWS Profile | AWS_PROFILE_NAME |
| IAM Role | AWS_ROLE_NAME, AWS_WEB_IDENTITY_TOKEN |
| Bedrock API Key | AWS_BEARER_TOKEN_BEDROCK |
You must also set the AWS region:
export AWS_REGION_NAME="us-east-1"
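As a sanity check before launching the agent, you can verify that the variables for your chosen method are present. A minimal sketch for the access-key method (variable names taken from the table above):

import os

# Fail fast if any required AWS environment variable is missing.
required = ["AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY", "AWS_REGION_NAME"]
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit(f"Missing AWS environment variables: {', '.join(missing)}")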
Model Names
Use the bedrock/ prefix followed by the Bedrock model ID:
| Model | Model ID |
|---|---|
| Claude 3.5 Sonnet v2 | bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0 |
| Claude 3 Opus | bedrock/anthropic.claude-3-opus-20240229-v1:0 |
| Claude 3 Haiku | bedrock/anthropic.claude-3-haiku-20240307-v1:0 |
| Claude 3.5 Haiku | bedrock/anthropic.claude-3-5-haiku-20241022-v1:0 |
Basic Usage
import os

from openhands.sdk import LLM, Agent, Conversation
from openhands.sdk.tool import Tool
from openhands.tools.terminal import TerminalTool
from openhands.tools.file_editor import FileEditorTool

llm = LLM(model="bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0")

agent = Agent(
    llm=llm,
    tools=[
        Tool(name=TerminalTool.name),
        Tool(name=FileEditorTool.name),
    ],
)

conversation = Conversation(agent=agent, workspace=os.getcwd())
conversation.send_message("List the files in this directory")
conversation.run()
Ready-to-run Example
Before running, ensure you have:
Installed boto3>=1.28.57
Set AWS credentials via environment variables
Enabled the model in your AWS Bedrock console
"""
AWS Bedrock with Claude models.
Prerequisites:
pip install openhands-sdk boto3>=1.28.57
Environment variables:
AWS_ACCESS_KEY_ID - Your AWS access key
AWS_SECRET_ACCESS_KEY - Your AWS secret key
AWS_REGION_NAME - AWS region (e.g., us-east-1)
"""
import os
from openhands.sdk import LLM , Agent, Conversation
from openhands.sdk.tool import Tool
from openhands.tools.terminal import TerminalTool
from openhands.tools.file_editor import FileEditorTool
# AWS credentials are read from environment variables by LiteLLM/boto3
# AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION_NAME
llm = LLM(
model = os.getenv( "LLM_MODEL" , "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0" ),
)
agent = Agent(
llm = llm,
tools = [
Tool( name = TerminalTool.name),
Tool( name = FileEditorTool.name),
],
)
conversation = Conversation( agent = agent, workspace = os.getcwd())
conversation.send_message( "List the files in this directory and summarize what you see." )
conversation.run()
# Report cost
cost = llm.metrics.accumulated_cost
print ( f "EXAMPLE_COST: { cost } " )
Running the Example
# Set AWS credentials
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export AWS_REGION_NAME="us-east-1"

# Run the example
python bedrock_example.py
Cross-Region Inference
AWS Bedrock supports cross-region inference. Include the region prefix in your model ID:
llm = LLM(model="bedrock/us.anthropic.claude-3-5-sonnet-20241022-v2:0")
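The us. prefix selects a US inference profile; Bedrock also offers eu. and apac. profiles that work the same way for their respective geographies.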
Troubleshooting
Access Denied
Ensure your AWS credentials have the bedrock:InvokeModel permission.
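If you are unsure which identity your credentials resolve to, a quick check with boto3's STS client (a debugging sketch, separate from the SDK setup) can help:

import boto3

# Print the ARN of the identity that the current credentials authenticate as.
print(boto3.client("sts").get_caller_identity()["Arn"])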
Model Not Found
Verify the model is enabled in your AWS account:
Go to AWS Console > Bedrock > Model access
Enable the models you want to use
Region Issues
Ensure AWS_REGION_NAME is set to a region where:
Bedrock is available
You have model access enabled
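To see which foundation models Bedrock offers in your configured region (and to double-check model IDs), here is a minimal boto3 sketch, assuming credentials and AWS_REGION_NAME are already set:

import os

import boto3

# The "bedrock" (control-plane) client lists the foundation models offered in a region.
client = boto3.client("bedrock", region_name=os.environ["AWS_REGION_NAME"])
for summary in client.list_foundation_models()["modelSummaries"]:
    print(summary["modelId"])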
Next Steps