A ready-to-run example is included in the Ready-to-run Example section below.
Use AWS Bedrock to access Claude and other foundation models through your AWS account.

Prerequisites

AWS Bedrock requires the boto3 library:
pip install openhands-sdk "boto3>=1.28.57"
boto3 is used internally by LiteLLM - you don’t need to import it in your code.
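If you want to confirm that the installed version meets this minimum, a quick sanity check (the 1.28.57 floor comes from the prerequisite above):

import boto3

# boto3 is only used indirectly, but an old version can cause Bedrock calls to fail.
print(boto3.__version__)  # should be >= 1.28.57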

Authentication Options

Configure AWS credentials using environment variables:
Method            Environment Variables
Access Keys       AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY
Session Token     AWS_SESSION_TOKEN (in addition to access keys)
AWS Profile       AWS_PROFILE_NAME
IAM Role          AWS_ROLE_NAME, AWS_WEB_IDENTITY_TOKEN
Bedrock API Key   AWS_BEARER_TOKEN_BEDROCK
You must also set the AWS region:
export AWS_REGION_NAME="us-east-1"
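If you prefer to configure credentials from Python rather than the shell (for example in a notebook), a minimal sketch is shown below; the variable names match the table above and the values are placeholders:

import os

# Set these before constructing the LLM so LiteLLM/boto3 can pick them up.
os.environ["AWS_ACCESS_KEY_ID"] = "your-access-key"      # placeholder
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-secret-key"  # placeholder
os.environ["AWS_REGION_NAME"] = "us-east-1"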

Model Names

Use the bedrock/ prefix followed by the Bedrock model ID:
Model                 Model ID
Claude 3.5 Sonnet v2  bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0
Claude 3 Opus         bedrock/anthropic.claude-3-opus-20240229-v1:0
Claude 3 Haiku        bedrock/anthropic.claude-3-haiku-20240307-v1:0
Claude 3.5 Haiku      bedrock/anthropic.claude-3-5-haiku-20241022-v1:0

Basic Usage

import os
from openhands.sdk import LLM, Agent, Conversation
from openhands.sdk.tool import Tool
from openhands.tools.terminal import TerminalTool
from openhands.tools.file_editor import FileEditorTool

llm = LLM(model="bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0")

agent = Agent(
    llm=llm,
    tools=[
        Tool(name=TerminalTool.name),
        Tool(name=FileEditorTool.name),
    ],
)

conversation = Conversation(agent=agent, workspace=os.getcwd())
conversation.send_message("List the files in this directory")
conversation.run()

Ready-to-run Example

Before running, ensure you have:
  1. Installed boto3>=1.28.57
  2. Set AWS credentials via environment variables
  3. Enabled the model in your AWS Bedrock console
"""
AWS Bedrock with Claude models.

Prerequisites:
    pip install openhands-sdk boto3>=1.28.57

Environment variables:
    AWS_ACCESS_KEY_ID       - Your AWS access key
    AWS_SECRET_ACCESS_KEY   - Your AWS secret key
    AWS_REGION_NAME         - AWS region (e.g., us-east-1)
"""
import os

from openhands.sdk import LLM, Agent, Conversation
from openhands.sdk.tool import Tool
from openhands.tools.terminal import TerminalTool
from openhands.tools.file_editor import FileEditorTool

# AWS credentials are read from environment variables by LiteLLM/boto3
# AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION_NAME

llm = LLM(
    model=os.getenv("LLM_MODEL", "bedrock/anthropic.claude-3-5-sonnet-20241022-v2:0"),
)

agent = Agent(
    llm=llm,
    tools=[
        Tool(name=TerminalTool.name),
        Tool(name=FileEditorTool.name),
    ],
)

conversation = Conversation(agent=agent, workspace=os.getcwd())
conversation.send_message("List the files in this directory and summarize what you see.")
conversation.run()

# Report cost
cost = llm.metrics.accumulated_cost
print(f"EXAMPLE_COST: {cost}")

Running the Example

# Set AWS credentials
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export AWS_REGION_NAME="us-east-1"

# Run the example
python bedrock_example.py

Cross-Region Inference

AWS Bedrock supports cross-region inference. Include the region prefix in your model ID:
llm = LLM(model="bedrock/us.anthropic.claude-3-5-sonnet-20241022-v2:0")

Troubleshooting

Access Denied

Ensure your AWS credentials have the bedrock:InvokeModel permission.
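To check the permission outside the SDK, here is a minimal sketch that calls Bedrock directly with boto3; it assumes the environment variables from the Authentication Options section, and the model ID is just one of the examples from the table above:

import json
import os

import boto3

# Direct invoke to confirm bedrock:InvokeModel is allowed for these credentials.
# Note: the raw Bedrock model ID is used here, without the bedrock/ routing prefix.
client = boto3.client("bedrock-runtime", region_name=os.environ["AWS_REGION_NAME"])

response = client.invoke_model(
    modelId="anthropic.claude-3-5-sonnet-20241022-v2:0",
    body=json.dumps(
        {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 16,
            "messages": [{"role": "user", "content": "Say hello"}],
        }
    ),
)
print(json.loads(response["body"].read())["content"][0]["text"])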

Model Not Found

Verify the model is enabled in your AWS account:
  1. Go to AWS Console > Bedrock > Model access
  2. Enable the models you want to use
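To see which models your account can actually use in the configured region, a short boto3 sketch (the anthropic filter below is illustrative):

import os

import boto3

# Lists the foundation models visible to your account in this region.
# If the model you requested is missing, enable it under Model access.
bedrock = boto3.client("bedrock", region_name=os.environ["AWS_REGION_NAME"])

for model in bedrock.list_foundation_models()["modelSummaries"]:
    if model["modelId"].startswith("anthropic."):
        print(model["modelId"])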

Region Issues

Ensure AWS_REGION_NAME is set to a region where:
  • Bedrock is available
  • You have model access enabled

Next Steps