This document contains the Python code for a Lambda function that acts as a proxy to AWS Bedrock models, along with deployment instructions.
Create the Lambda function (the examples here assume it is named `bedrock-proxy`). Your Lambda function needs permission to invoke Bedrock models. Add the following policy to your Lambda's execution role:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "*"
    }
  ]
}
```
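If you want to apply least privilege from the start, `Resource` can be narrowed to specific foundation-model ARNs instead of `*`. The ARN below is illustrative; adjust the region and model ID to your setup:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": [
        "arn:aws:bedrock:*::foundation-model/anthropic.claude-3-5-sonnet-20240620-v1:0"
      ]
    }
  ]
}
```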
Give the policy a descriptive name (e.g. `bedrock-invoke-policy`).

The Lambda function is configured to use Bearer token authentication if an `AUTH_TOKEN` environment variable is set. Generate a random token with one of the following commands:
```shell
# On Linux/Mac
openssl rand -base64 32

# On Windows PowerShell
[Convert]::ToBase64String([Security.Cryptography.RandomNumberGenerator]::GetBytes(24))
```
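As a portable alternative to the shell commands above, Python's standard `secrets` module generates an equivalent token:

```python
import secrets

# 32 bytes of randomness, URL-safe base64 encoded
token = secrets.token_urlsafe(32)
print(token)
```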
Then add an environment variable named `AUTH_TOKEN` to the Lambda function, with your generated token as the value.

The Lambda proxy supports both streaming and non-streaming requests. To use streaming:

- Add `"stream": true` to your request JSON
- The response is streamed as Server-Sent Events: `data: {"content": "chunk text"}\n\n` chunks, with `data: [DONE]\n\n` as the end signal
- Each chunk payload is a `{"content": "chunk text"}` object; a final `{"done": true}` object is sent before the `[DONE]` marker

For complete details on streaming implementation, see bedrock-streaming-docs.md.
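The SSE framing can be decoded in a few lines of Python. This is a minimal sketch assuming exactly the `data: {...}` / `data: [DONE]` format described above, not the proxy's actual client code:

```python
import json

def parse_sse_stream(lines):
    """Yield the text content of each `data: {...}` chunk, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip blank separator / keep-alive lines
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream marker
        chunk = json.loads(payload)
        if chunk.get("done"):
            break  # final {"done": true} object
        yield chunk.get("content", "")

# Example with a synthetic stream:
events = [
    'data: {"content": "Par"}',
    "",
    'data: {"content": "is"}',
    "",
    'data: {"done": true}',
    "",
    "data: [DONE]",
]
print("".join(parse_sse_stream(events)))  # → Paris
```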
JavaScript:

```javascript
// Your AUTH_TOKEN from Lambda environment variables
const authToken = 'your-generated-token-here';

fetch('https://your-lambda-function-url-or-api-gateway-url', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': `Bearer ${authToken}`
  },
  body: JSON.stringify({
    // Required: specify which model to use
    modelId: 'eu.anthropic.claude-3-5-sonnet-20240620-v1:0',
    // Optional: set to true for streaming response
    stream: false,
    // The rest is the model-specific payload
    anthropic_version: 'bedrock-2023-05-31',
    max_tokens: 1000,
    messages: [
      {
        role: 'user',
        content: [
          {
            type: 'text',
            text: 'What is the capital of France?'
          }
        ]
      }
    ]
  })
})
  .then(response => response.json())
  .then(data => console.log(data))
  .catch(error => console.error('Error:', error));
```
Python (note that the JSON literals `false`/`true` become `False`/`True` here):

```python
import requests

url = "https://your-lambda-function-url-or-api-gateway-url"
auth_token = "your-generated-token-here"  # Your AUTH_TOKEN from Lambda environment variables

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {auth_token}"
}

payload = {
    "modelId": "eu.anthropic.claude-3-5-sonnet-20240620-v1:0",
    "stream": False,  # Set to True for streaming
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 1000,
    "messages": [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What is the capital of France?"}
            ]
        }
    ]
}

response = requests.post(url, headers=headers, json=payload)
print(response.json())
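To avoid repeating the payload boilerplate across requests, the request body can be factored into a small helper. `build_payload` is a hypothetical name for illustration, not part of the proxy:

```python
def build_payload(model_id, prompt, max_tokens=1000, stream=False):
    """Build a Claude-style request body for the Bedrock proxy."""
    return {
        "modelId": model_id,
        "stream": stream,
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": max_tokens,
        "messages": [
            {"role": "user", "content": [{"type": "text", "text": prompt}]}
        ],
    }

payload = build_payload(
    "eu.anthropic.claude-3-5-sonnet-20240620-v1:0",
    "What is the capital of France?",
)
```

The result is passed to `requests.post(url, headers=headers, json=payload)` exactly as in the example above.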
- **Secure Storage:** Never hardcode tokens in your application. Use environment variables, secure vaults, or configuration management solutions.
- **Token Rotation:** Periodically update your tokens to reduce the risk of unauthorized access.
- **Least Privilege:** Consider creating different tokens with different access levels if you have multiple applications or use cases.
- **Monitoring:** Implement logging and monitoring to detect unusual patterns that might indicate token compromise.
- **Transport Security:** Always use HTTPS to protect tokens in transit.
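On the Lambda side, the Bearer token check is best done with a constant-time comparison so that timing differences do not leak information about the token. This is a sketch of the idea, not the proxy's actual handler code; the `event` shape assumes the Lambda Function URL format, which lowercases header names:

```python
import hmac
import os

def is_authorized(event):
    """Compare the presented Bearer token against AUTH_TOKEN in constant time."""
    expected = os.environ.get("AUTH_TOKEN")
    if not expected:
        return True  # auth disabled when AUTH_TOKEN is unset
    auth_header = event.get("headers", {}).get("authorization", "")
    if not auth_header.startswith("Bearer "):
        return False
    presented = auth_header[len("Bearer "):]
    # hmac.compare_digest avoids short-circuiting on the first mismatched byte
    return hmac.compare_digest(presented, expected)
```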
If you need more robust security, consider:

- **JWT Tokens:** Implement JWT (JSON Web Token) validation for more sophisticated authentication with expiration and claims.
- **IP Restrictions:** Add IP-based restrictions in your Lambda function to only accept requests from trusted sources.
- **AWS WAF:** Deploy AWS WAF (Web Application Firewall) in front of your API Gateway for additional protection.
For invoking different models, you’ll need to adjust the payload format according to the model’s requirements. The Claude model format is shown in the examples above.
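For example, Meta's Llama models on Bedrock take a flat prompt-based body instead of the `messages` array. The fields and `modelId` below are illustrative; confirm them against the Bedrock model-parameters documentation for your model:

```json
{
  "modelId": "meta.llama3-8b-instruct-v1:0",
  "stream": false,
  "prompt": "What is the capital of France?",
  "max_gen_len": 512,
  "temperature": 0.5
}
```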