Interacting with an Amazon Bedrock Model through API Gateway and Lambda Functions
Introduction:
In this post, we will explore how to interact with an Amazon Bedrock model through a secured API Gateway and Lambda function. The API Gateway is secured using a Lambda Authorizer, ensuring that only authorized requests can access the Bedrock model. This setup provides a scalable and secure way to integrate machine learning models into applications.
About the Project:
In this project, we have set up an Amazon Bedrock model, an API Gateway, and two Lambda functions: the “Authorizer” Lambda function, which acts as an access gatekeeper, and the “Main” Lambda function, which sends requests to the Bedrock model. We also use the Systems Manager Parameter Store for storing the authorization token securely.
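The Authorizer Lambda function itself is not listed in this post, but to make the flow concrete, here is a minimal sketch of what a TOKEN-type Lambda Authorizer for this setup could look like. It assumes the caller sends the token in the request's authorization header and that it is compared against the AuthorizationLambdaToken parameter from Parameter Store; the actual implementation lives in the repository and may differ.

import boto3

ssm = boto3.client('ssm')

def lambda_handler(event, context):
    # Read the expected token from SSM Parameter Store (SecureString).
    expected_token = ssm.get_parameter(
        Name='AuthorizationLambdaToken',
        WithDecryption=True
    )['Parameter']['Value']

    # For a TOKEN authorizer, API Gateway passes the header value here.
    provided_token = event.get('authorizationToken', '')

    effect = 'Allow' if provided_token == expected_token else 'Deny'

    # Return the IAM policy that API Gateway evaluates for this request.
    return {
        'principalId': 'user',
        'policyDocument': {
            'Version': '2012-10-17',
            'Statement': [{
                'Action': 'execute-api:Invoke',
                'Effect': effect,
                'Resource': event['methodArn']
            }]
        }
    }

Note that the Authorizer Lambda function's execution role needs ssm:GetParameter permission on the token parameter for this to work.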
At the time of writing, AWS does not support managing Amazon Bedrock model access through the AWS CLI; granting access to models must be done via the AWS Console. The token in the Parameter Store was created using the AWS CLI because CloudFormation does not yet support creating parameters of the SecureString type. For more information, see the AWS CloudFormation SSM Parameter documentation.
All other resources — Lambda functions and API Gateway — were created using CloudFormation, ensuring that the infrastructure is managed as code and can be easily deployed and maintained.
This project is based on my other project about serverless architecture using API Gateway, a Lambda Authorizer, and Secrets Manager. In the current version, we store the token in Parameter Store instead of Secrets Manager to lower costs, and we added the necessary integration between the Main Lambda function and the Bedrock model.
The Main Lambda function and IAM role configuration in the infrastructure/root.yaml CloudFormation template:
Parameters:
  BedrockModelId:
    Type: String
    Default: ''

Resources:
  MainLambdaFunction:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: MainLambdaFunction
      Description: Make requests to Bedrock models
      Runtime: python3.12
      Handler: index.lambda_handler
      Role: !GetAtt MainLambdaExecutionRole.Arn
      Timeout: 30
      MemorySize: 512
      Environment:
        Variables:
          BEDROCK_MODEL_ID: !Ref BedrockModelId
      Code:
        ZipFile: |
          import json
          import os
          import boto3
          from botocore.exceptions import ClientError

          bedrock_runtime = boto3.client('bedrock-runtime')

          def lambda_handler(event, context):
              try:
                  model_id = os.environ['BEDROCK_MODEL_ID']

                  # Validate the input (queryStringParameters can be null)
                  input_text = (event.get("queryStringParameters") or {}).get("inputText")
                  if not input_text:
                      raise ValueError("Input text is required in the request query parameters.")

                  # Prepare the payload for invoking the Bedrock model
                  payload = json.dumps({
                      "inputText": input_text,
                      "textGenerationConfig": {
                          "maxTokenCount": 8192,
                          "stopSequences": [],
                          "temperature": 0,
                          "topP": 1
                      }
                  })

                  # Invoke the Bedrock model
                  response = bedrock_runtime.invoke_model(
                      modelId=model_id,
                      contentType="application/json",
                      accept="application/json",
                      body=payload
                  )

                  # Check if the 'body' exists in the response and handle it correctly
                  if 'body' not in response or not response['body']:
                      raise ValueError("Response body is empty.")
                  response_body = json.loads(response['body'].read().decode('utf-8'))

                  return {
                      'statusCode': 200,
                      'body': json.dumps(response_body)
                  }

              except ClientError as e:
                  # Errors returned by the Bedrock API (throttling, access denied, etc.)
                  return {
                      'statusCode': 500,
                      'body': json.dumps({"error": "Error interacting with the Bedrock API"})
                  }
              except ValueError as e:
                  # Input validation errors
                  return {
                      'statusCode': 400,
                      'body': json.dumps({"error": str(e)})
                  }
              except Exception as e:
                  # Any other unexpected error
                  return {
                      'statusCode': 500,
                      'body': json.dumps({"error": "Internal Server Error"})
                  }

  MainLambdaExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      RoleName: MainLambdaExecutionRole
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
            Action:
              - sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
      Policies:
        - PolicyName: BedrockAccessPolicy
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - bedrock:InvokeModel
                  - bedrock:ListFoundationModels
                Resource: '*'
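The BedrockModelId parameter expects the ID of a model you have been granted access to (for example, amazon.titan-text-express-v1 for Titan Text G1 — Express). If you are unsure of the exact ID, one way to look it up — assuming your credentials allow bedrock:ListFoundationModels, as the role above does — is a short boto3 sketch like this:

import boto3

# List the foundation models visible in the current region and print their IDs.
bedrock = boto3.client('bedrock')

for model in bedrock.list_foundation_models()['modelSummaries']:
    print(model['modelId'], '-', model['modelName'])

The printed model ID is what goes into the BedrockModelId parameter and, through it, into the BEDROCK_MODEL_ID environment variable of the Main Lambda function.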
Prerequisites:
Before you start, make sure the following requirements are met:
- An AWS account with permissions to create resources.
- AWS CLI installed on your local machine.
Deployment:
1. Clone the repository.
git clone https://gitlab.com/Andr1500/api-gateway-lambda-bedrock.git
2. Set up Amazon Bedrock model.
Go to the Amazon Bedrock service in the AWS Console.
Navigate to Get Started -> Request model access -> Modify model access -> Choose the appropriate model available in your region (e.g., Titan Text G1 — Express) -> Next -> Submit.
Wait a few minutes and refresh the page to see “Access granted”.
Navigate to Overview -> choose the provider of the available model -> choose the model -> review the model's API request. The API request configuration can differ from model to model.
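For Titan Text G1 — Express, the response body that the Main Lambda function returns contains a results list with the generated text. Below is a minimal parsing sketch; the field names follow the Titan Text response format and the values are illustrative, so verify them against the API request view in the console for your model.

import json

# Example Titan Text response body as returned by the Main Lambda function;
# other models use different response shapes.
api_response_text = (
    '{"inputTextTokenCount": 5, "results": [{"tokenCount": 12, '
    '"outputText": "Hello from Titan.", "completionReason": "FINISH"}]}'
)

body = json.loads(api_response_text)
for result in body.get("results", []):
    print(result["outputText"])        # generated text
    print(result["completionReason"])  # e.g. FINISH or LENGTH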
3. Create the authorization token in AWS Systems Manager Parameter Store.
aws ssm put-parameter --name "AuthorizationLambdaToken" --value "token_value_secret" --type "SecureString"
4. Fill in all necessary parameters in the infrastructure/root.yaml CloudFormation template and in the scripts/retrieve_invoke_url.sh script, and create the CloudFormation stack.
aws cloudformation create-stack \
--stack-name apigw-lambda-bedrock \
--template-body file://infrastructure/root.yaml \
--capabilities CAPABILITY_NAMED_IAM \
--disable-rollback
5. Retrieve the Invoke URL of the Stage using the scripts/retrieve_invoke_url.sh script.
6. Test the Bedrock model, the Main Lambda function, and the API request from the CLI. Because, according to their documentation, the outfile argument is required for both aws bedrock-runtime invoke-model and aws lambda invoke, the response content has to be saved to a file and read from there; a sketch of calling the deployed endpoint follows below.
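Alternatively, the whole chain (API Gateway, Lambda Authorizer, Main Lambda function, Bedrock model) can be exercised with a small Python script. This is only a sketch with placeholder values: the invoke URL comes from the previous step, the token is the value stored in Parameter Store, and the header name depends on how the Lambda Authorizer is configured (a TOKEN authorizer commonly reads the Authorization header).

import json
import urllib.parse
import urllib.request

# Placeholders: replace with your Stage invoke URL and the stored token value.
INVOKE_URL = "https://example123.execute-api.eu-central-1.amazonaws.com/dev"
TOKEN = "token_value_secret"

query = urllib.parse.urlencode({"inputText": "Tell me a short joke."})
request = urllib.request.Request(
    f"{INVOKE_URL}?{query}",
    headers={"Authorization": TOKEN},  # header name is an assumption
)

with urllib.request.urlopen(request) as response:
    print(json.dumps(json.loads(response.read()), indent=2))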
7. Delete the token from the Systems Manager Parameter Store and delete the CloudFormation stack.
aws ssm delete-parameter --name "AuthorizationLambdaToken"
aws cloudformation delete-stack --stack-name apigw-lambda-bedrock
Conclusion:
By leveraging API Gateway, Lambda functions, and Amazon Bedrock models, we can create a scalable and efficient solution for deploying machine learning models in a serverless environment. With the addition of a Lambda Authorizer, this solution is more secure, protecting against unauthorized access.