This is a SAM (Serverless Application Model) project designed for troubleshooting and testing Powertools for AWS Lambda (Python). The project includes a Lambda function connected to API Gateway for easy deployment and testing.
- src/ - Contains the Lambda function code and dependencies
- src/app.py - Main Lambda function handler
- src/requirements.txt - Python dependencies
- events/ - Sample invocation events for testing
- template.yaml - SAM template defining AWS resources (Lambda + API Gateway)
Before getting started, ensure you have the following installed:
- SAM CLI
- Python 3.9+
- Docker
- AWS CLI configured with appropriate credentials
To add Powertools for AWS Lambda to your project, update the src/requirements.txt file:
requests
aws-lambda-powertools
For specific Powertools features, you can install optional extras (Logger and Metrics need no extras; they ship with the base package):
requests
aws-lambda-powertools[all]      # every optional dependency
aws-lambda-powertools[tracer]   # X-Ray tracing support
aws-lambda-powertools[parser]   # Pydantic-based parsing
A fuller requirements.txt for troubleshooting might look like:
# Core Powertools
aws-lambda-powertools
# Additional useful libraries for troubleshooting
boto3
botocore
pydantic
Update src/app.py to include Powertools features:
import json

from aws_lambda_powertools import Logger, Tracer, Metrics
from aws_lambda_powertools.logging import correlation_paths
from aws_lambda_powertools.metrics import MetricUnit

# Initialize Powertools
logger = Logger()
tracer = Tracer()
metrics = Metrics()


@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST)
@tracer.capture_lambda_handler
@metrics.log_metrics
def lambda_handler(event, context):
    logger.info("Lambda function invoked", extra={"event": event})

    # Add custom metrics
    metrics.add_metric(name="InvocationCount", unit=MetricUnit.Count, value=1)

    try:
        # Your business logic here
        result = {"message": "Hello from Powertools!", "statusCode": 200}
        logger.info("Function executed successfully", extra={"result": result})
        return {
            "statusCode": 200,
            "body": json.dumps(result),
            "headers": {
                "Content-Type": "application/json"
            }
        }
    except Exception as e:
        logger.exception("Error processing request")
        tracer.put_annotation(key="error", value=str(e))
        return {
            "statusCode": 500,
            "body": json.dumps({"error": "Internal server error"}),
            "headers": {
                "Content-Type": "application/json"
            }
        }
Build your application using the SAM CLI:
sam build --use-container
This command installs dependencies from src/requirements.txt and creates a deployment package.
The template is already configured to connect your Lambda function to API Gateway. Start the local API server:
sam local start-api
This starts a local API Gateway on http://localhost:3000. The function will be available at any path since the template uses /{proxy+} with the ANY method.
# Test with GET request
curl http://localhost:3000/
# Test with POST request
curl -X POST http://localhost:3000/test \
-H "Content-Type: application/json" \
-d '{"key": "value"}'
# Test with query parameters
curl "http://localhost:3000/hello?name=World"
Test the function directly with a test event:
sam local invoke HelloWorldFunction --event events/event.json
Create custom test events in the events/ folder:
{
  "httpMethod": "GET",
  "path": "/test",
  "queryStringParameters": {
    "param1": "value1"
  },
  "headers": {
    "Content-Type": "application/json"
  },
  "body": "{\"test\": \"data\"}"
}
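If you prefer exercising the handler from Python rather than the SAM CLI, the sketch below shows a minimal pytest-style test. The file layout, the FakeContext helper, and the event path are assumptions for illustration, not part of the project.
# test_app.py -- hypothetical test module; adjust imports to your layout
import json
import os

# Powertools decorators need these before the handler module is imported:
# disable tracing outside Lambda and provide a metrics namespace.
os.environ["POWERTOOLS_TRACE_DISABLED"] = "1"
os.environ["POWERTOOLS_METRICS_NAMESPACE"] = "TroubleshootApp"

from src.app import lambda_handler  # noqa: E402


class FakeContext:
    # Minimal stand-in for the attributes inject_lambda_context reads
    function_name = "HelloWorldFunction"
    memory_limit_in_mb = 128
    invoked_function_arn = "arn:aws:lambda:us-east-1:123456789012:function:HelloWorldFunction"
    aws_request_id = "local-test-request"


def test_handler_returns_200():
    with open("events/event.json") as f:
        event = json.load(f)

    response = lambda_handler(event, FakeContext())

    assert response["statusCode"] == 200
This runs entirely on your machine and is much faster than container-based invocation when you only need to check handler logic.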
The events/ folder contains a comprehensive collection of sample events that integrate with AWS Lambda, covering most AWS services that can trigger Lambda functions. These events are useful for local testing and understanding the structure of different event sources.
The folder includes events for:
- API Gateway (REST API, HTTP API, WebSocket)
- S3 (Object events, EventBridge notifications)
- DynamoDB (Streams, tumbling windows)
- SQS (Standard, FIFO, DLQ triggers)
- SNS (Direct and SQS integration)
- Kinesis (Data Streams, Firehose)
- EventBridge (Custom events, scheduled events)
- Cognito (Authentication triggers)
- CloudWatch (Logs, alarms, dashboards)
- AppSync (Resolvers, authorizers)
- And many more...
Test your function with different event types:
# Test with API Gateway event
sam local invoke HelloWorldFunction --event events/apiGatewayProxyEvent.json
# Test with S3 event
sam local invoke HelloWorldFunction --event events/s3Event.json
# Test with SQS event
sam local invoke HelloWorldFunction --event events/sqsEvent.json
This allows you to test how your Powertools-enabled function handles different AWS service integrations without deploying to AWS.
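The shape of each event reveals which service triggered the function. The sketch below shows one way a troubleshooting handler might branch on the event source; the key checks follow the standard Lambda event structures, and the helper name is illustrative.
from aws_lambda_powertools import Logger

logger = Logger()


def identify_event_source(event: dict) -> str:
    # Rough heuristics based on well-known keys in each event shape
    if "httpMethod" in event or event.get("requestContext", {}).get("http"):
        return "api-gateway"
    records = event.get("Records") or []
    if records:
        first = records[0]
        if "s3" in first:
            return "s3"
        if first.get("eventSource") == "aws:sqs":
            return "sqs"
        if first.get("eventSource") == "aws:dynamodb":
            return "dynamodb"
        if first.get("eventSource") == "aws:kinesis":
            return "kinesis"
        if first.get("EventSource") == "aws:sns":  # SNS capitalises this key
            return "sns"
    if "detail-type" in event and "source" in event:
        return "eventbridge"
    return "unknown"


def lambda_handler(event, context):
    logger.info("Event received", extra={"event_source": identify_event_source(event)})
    return {"statusCode": 200}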
Set environment variables for local testing:
# Enable debug logging
export POWERTOOLS_LOG_LEVEL=DEBUG
# Set service name for tracing
export POWERTOOLS_SERVICE_NAME=troubleshoot-service
# Start local API with environment variables
sam local start-api --env-vars env.json
Create an env.json file:
{
  "HelloWorldFunction": {
    "POWERTOOLS_LOG_LEVEL": "DEBUG",
    "POWERTOOLS_SERVICE_NAME": "troubleshoot-service",
    "POWERTOOLS_METRICS_NAMESPACE": "TroubleshootApp"
  }
}
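No code changes are needed for these variables to take effect; the Logger reads POWERTOOLS_SERVICE_NAME and POWERTOOLS_LOG_LEVEL at initialization, as this small sketch illustrates:
from aws_lambda_powertools import Logger

# Service name and log level come from the environment when not passed explicitly
logger = Logger()


def lambda_handler(event, context):
    logger.debug("Emitted only when POWERTOOLS_LOG_LEVEL is DEBUG")
    logger.info("Emitted at INFO or DEBUG; output includes the configured service name")
    return {"statusCode": 200}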
Deploy your application to AWS:
sam build --use-container
sam deploy --guided
Follow the prompts to configure:
- Stack Name: Choose a unique name (e.g., powertools-troubleshoot)
- AWS Region: Your preferred region
- Confirm changes: Yes (recommended for first deployment)
- Allow IAM role creation: Yes
- Save parameters: Yes
After the first deployment, simply run:
sam build --use-container && sam deploy
The API Gateway endpoint URL will be displayed in the output after deployment.
If you encounter import errors:
# Rebuild with container to ensure proper dependencies
sam build --use-container
# Check if powertools is in requirements.txt
cat src/requirements.txt
# Check if Docker is running
docker ps
# Restart local API if needed
sam local start-api --debug
# Test with verbose output
sam local invoke HelloWorldFunction --event events/event.json --debug
Enable debug logging in code by setting the environment variable before the Logger is created (for example at the top of app.py):
import os
os.environ['POWERTOOLS_LOG_LEVEL'] = 'DEBUG'
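Alternatively, the service name and level can be passed directly to the Logger constructor, which is handy for temporary debugging sessions:
from aws_lambda_powertools import Logger

# Explicit level set in code
logger = Logger(service="troubleshoot-service", level="DEBUG")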
Update the template.yaml if you need more resources:
Globals:
  Function:
    Timeout: 30
    MemorySize: 512  # Add this line
Read more here: https://docs.powertools.aws.dev/lambda/python/latest/build_recipes/
Lambda supports both x86_64 and arm64 architectures. Build issues often occur when dependencies are compiled for the wrong architecture.
# Error: ImportError: /var/task/lib/python3.x/site-packages/some_package.so: cannot open shared object file
# This usually indicates architecture mismatch
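To confirm which architecture the function is actually running on (locally with sam local invoke or in AWS), a quick stdlib-only diagnostic handler you could temporarily deploy:
import platform
import sys


def lambda_handler(event, context):
    # x86_64 vs aarch64 reveals the runtime architecture; compare it with
    # the Architectures setting in template.yaml
    return {
        "statusCode": 200,
        "body": f"machine={platform.machine()} python={sys.version}",
    }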
- Use SAM with Container Builds (Recommended):
# Always use container builds for consistent architecture
sam build --use-container
- Specify Architecture in Template:
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      Architectures:
        - x86_64  # or arm64
      # ... other properties
- Force Architecture in Docker Build:
# For ARM64 Lambda functions
sam build --use-container --build-image public.ecr.aws/sam/build-python3.9:latest-arm64
# For x86_64 Lambda functions
sam build --use-container --build-image public.ecr.aws/sam/build-python3.9:latest-x86_64
Common pip problems and solutions:
# Problem: pip install fails with compilation errors
# Solution: Use pre-compiled wheels or container builds
sam build --use-container
# Problem: Different pip versions causing issues
# Solution: Pin pip version in requirements.txt
pip==23.3.1
aws-lambda-powertools
If using uv as a faster pip replacement:
# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
# Use uv in your build process
uv pip install -r requirements.txt --target ./package
Create a custom build script with uv:
#!/bin/bash
# build.sh
set -e
echo "Building with uv..."
uv pip install -r src/requirements.txt --target .aws-sam/build/HelloWorldFunction/
# Copy source code
cp -r src/* .aws-sam/build/HelloWorldFunction/
For projects using Poetry:
# pyproject.toml
[tool.poetry.dependencies]
python = "^3.9"
aws-lambda-powertools = "^2.0.0"
requests = "^2.31.0"
[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
Build with Poetry:
# Export requirements for SAM
poetry export -f requirements.txt --output src/requirements.txt --without-hashes
# Then build normally
sam build --use-container
Some Python packages require compilation:
# Problematic packages that need compilation
numpy
pandas
Pillow
cryptography
Solutions:
- Use Lambda Layers:
# template.yaml
Resources:
  PowertoolsLayer:
    Type: AWS::Serverless::LayerVersion
    Properties:
      LayerName: powertools-layer
      ContentUri: layers/powertools/
      CompatibleRuntimes:
        - python3.9
        - python3.10
      CompatibleArchitectures:
        - x86_64

  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      Layers:
        - !Ref PowertoolsLayer
- Use Pre-built Layers:
# Use AWS-provided Powertools layer
Layers:
  - arn:aws:lambda:us-east-1:017000801446:layer:AWSLambdaPowertoolsPythonV3:21
- Container Images (for complex dependencies):
# Dockerfile
FROM public.ecr.aws/lambda/python:3.9
COPY requirements.txt ${LAMBDA_TASK_ROOT}
RUN pip install -r requirements.txt
COPY src/ ${LAMBDA_TASK_ROOT}
CMD ["app.lambda_handler"]
Update template for container:
Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      PackageType: Image
      ImageUri: your-account.dkr.ecr.region.amazonaws.com/your-repo:latest
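Whichever packaging approach you choose (bundled dependencies, a layer, or a container image), a small diagnostic sketch can confirm which Powertools installation the function actually imports; importlib.metadata is part of the standard library:
import importlib.metadata

import aws_lambda_powertools


def lambda_handler(event, context):
    # A path under /opt/python means the package came from a Lambda layer;
    # a path under /var/task means it was bundled with the deployment package.
    return {
        "statusCode": 200,
        "body": (
            f"powertools {importlib.metadata.version('aws-lambda-powertools')} "
            f"loaded from {aws_lambda_powertools.__file__}"
        ),
    }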
# Problem: Building on Apple Silicon for x86_64 Lambda
# Solution: Use Docker with platform specification
docker run --platform linux/amd64 -v $(pwd):/workspace -w /workspace python:3.9 pip install -r requirements.txt -t ./package
# Or use SAM with specific architecture
sam build --use-container --build-image public.ecr.aws/sam/build-python3.9:latest-x86_64
# Problem: Path separators and line endings
# Solution: Use WSL2 or Docker Desktop with Linux containers
sam build --use-container
# Ensure requirements.txt has Unix line endings
dos2unix src/requirements.txt
# Debug SAM build issues
sam build --use-container --debug
# Check build artifacts
ls -la .aws-sam/build/HelloWorldFunction/
# Error: "No module named '_ctypes'"
# Solution: Use container build or update base image
# Error: "Microsoft Visual C++ 14.0 is required"
# Solution: Use Linux container build on Windows
# Error: "Failed building wheel for package"
# Solution: Install build dependencies or use pre-compiled wheels
pip install --only-binary=all package-name
# Disable SAM CLI telemetry (optional)
export SAM_CLI_TELEMETRY=0
# Use the build cache
sam build --use-container --cached
# Parallel builds
sam build --use-container --parallel
# Reuse the already-pulled build image instead of pulling it again
sam build --use-container --skip-pull-image
The template uses a catch-all configuration that routes all requests to your Lambda function:
Events:
  HelloWorld:
    Type: Api
    Properties:
      Path: /{proxy+}
      Method: ANY
This means your function will receive all HTTP methods (GET, POST, PUT, DELETE, etc.) on any path. Handle different routes in your Lambda function code:
import json


def lambda_handler(event, context):
    http_method = event.get('httpMethod')
    path = event.get('path')

    if http_method == 'GET' and path == '/health':
        return health_check()        # your own helper
    elif http_method == 'POST' and path == '/process':
        return process_data(event)   # your own helper
    else:
        return {
            "statusCode": 404,
            "body": json.dumps({"error": "Not found"})
        }
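Powertools also ships an event handler that does this routing for you. The sketch below uses APIGatewayRestResolver; the route names are illustrative.
from aws_lambda_powertools import Logger
from aws_lambda_powertools.event_handler import APIGatewayRestResolver

logger = Logger()
app = APIGatewayRestResolver()


@app.get("/health")
def health_check():
    return {"status": "ok"}


@app.post("/process")
def process_data():
    # app.current_event gives typed access to the API Gateway event
    payload = app.current_event.json_body
    logger.info("Processing payload", extra={"payload": payload})
    return {"processed": True}


def lambda_handler(event, context):
    # Unmatched routes automatically return a 404 response
    return app.resolve(event, context)
The resolver also serializes dictionary return values to JSON responses for you, which removes most of the boilerplate in the manual version above.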
After deployment, view your function logs:
# Tail logs in real-time
sam logs -n HelloWorldFunction --stack-name your-stack-name --tail
# View logs from the last 10 minutes
sam logs -n HelloWorldFunction --stack-name your-stack-name --start-time '10min ago'
# Filter logs by keyword
sam logs -n HelloWorldFunction --stack-name your-stack-name --filter ERROR
With Powertools Logger, your logs will be structured JSON:
{
  "timestamp": "2024-01-15T10:30:00.000Z",
  "level": "INFO",
  "message": "Lambda function invoked",
  "service": "troubleshoot-service",
  "correlation_id": "abc-123-def",
  "event": {...}
}
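You can attach extra keys to every subsequent log line, which helps when correlating a single request across log entries; a minimal sketch using the Logger's append_keys (the order_id key is illustrative):
from aws_lambda_powertools import Logger

logger = Logger()


def lambda_handler(event, context):
    order_id = (event.get("pathParameters") or {}).get("id")
    # Every log line emitted after this call includes the order_id key
    logger.append_keys(order_id=order_id)
    logger.info("Order lookup started")
    return {"statusCode": 200}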
Enable X-Ray tracing in your template:
Globals:
  Function:
    Tracing: Active
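With tracing active, the Tracer utility can annotate traces and capture nested calls as their own subsegments; a minimal sketch (the fetch_data helper is illustrative):
from aws_lambda_powertools import Tracer

tracer = Tracer(service="troubleshoot-service")


@tracer.capture_method
def fetch_data(item_id: str) -> dict:
    # Appears as its own subsegment in the X-Ray trace
    return {"id": item_id}


@tracer.capture_lambda_handler
def lambda_handler(event, context):
    tracer.put_annotation(key="path", value=event.get("path", "unknown"))
    tracer.put_metadata(key="event", value=event)
    return {"statusCode": 200, "body": str(fetch_data("123"))}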
from aws_lambda_powertools import Logger
from aws_lambda_powertools.logging import correlation_paths

logger = Logger()


@logger.inject_lambda_context(correlation_id_path=correlation_paths.API_GATEWAY_REST)
def lambda_handler(event, context):
    logger.info("Processing request", extra={"user_id": "123"})
    return {"statusCode": 200, "body": "Success"}
from aws_lambda_powertools import Metrics
from aws_lambda_powertools.metrics import MetricUnit

metrics = Metrics(namespace="TroubleshootApp")


@metrics.log_metrics
def lambda_handler(event, context):
    metrics.add_metric(name="RequestCount", unit=MetricUnit.Count, value=1)
    metrics.add_metadata(key="path", value=event.get("path"))
    return {"statusCode": 200}
from aws_lambda_powertools import Logger
from aws_lambda_powertools.utilities.parser import event_parser
from aws_lambda_powertools.utilities.parser.models import APIGatewayProxyEventModel
from pydantic import BaseModel

logger = Logger()


class UserRequest(BaseModel):
    name: str
    email: str


@event_parser(model=APIGatewayProxyEventModel)
def lambda_handler(event: APIGatewayProxyEventModel, context):
    # Parse and validate the request body
    user_data = UserRequest.parse_raw(event.body)
    logger.info("User data received", extra={"user": user_data.dict()})
    return {"statusCode": 200}
To delete the deployed application:
sam delete --stack-name your-stack-name