This script produces a CSV report on EC2 instances across multiple accounts by assuming AWS roles. The output is a single CSV file containing all instances across the accounts provided, and it is automatically uploaded to S3. The report only retrieves the attributes and tags defined in the Lambda environment variables.
All of the accounts you are assuming the role in will require the role to be defined, and they must all use the same role name.
The role must trust the account from which the assume is being performed and have, at a minimum, read-only EC2 access.
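As an illustrative sketch (not taken verbatim from any particular setup), the trust relationship on the role in each target account could look like the following, where account_number is the account the Lambda runs in; the principal shown trusts that whole account, so tighten it to suit your requirements:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::account_number:root"
                },
                "Action": "sts:AssumeRole"
            }
        ]
    }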
Create a role which can be used by Lambda. The role applied to Lambda must, at a minimum, have the following policies attached:
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": "sts:AssumeRole",
                "Resource": "arn:aws:iam::account_number:role/role_name"
            }
        ]
    }
The Resource can be made into a list to define the role in multiple accounts.
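For example (the account numbers and role name are placeholders):

    "Resource": [
        "arn:aws:iam::111111111111:role/role_name",
        "arn:aws:iam::222222222222:role/role_name",
        "arn:aws:iam::333333333333:role/role_name"
    ]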
The Lambda role also needs permission to write its own log events to CloudWatch Logs:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "logs:CreateLogGroup",
                    "logs:CreateLogStream",
                    "logs:PutLogEvents"
                ],
                "Resource": "*"
            }
        ]
    }
Finally, the Lambda role needs permission to upload the report to the S3 bucket:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "s3:ListAllMyBuckets",
                    "s3:GetBucketLocation"
                ],
                "Resource": "*"
            },
            {
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": [
                    "arn:aws:s3:::bucket_name",
                    "arn:aws:s3:::bucket_name/*"
                ]
            }
        ]
    }
The code has been deployed as Python 3.8 (the latest at the time of writing) within Lambda.
The timeout value will depend on the number of accounts and EC2 instances you are reporting on; I started with 10 seconds, which was sufficient.
Configure a test event with nothing defined in the JSON, e.g.
{}
To ensure the code can easily be re-used, I have set all the key elements as Lambda environment variables. These can be updated by anyone without having to touch the code: for example, adding a new account or tag is simply a matter of updating the relevant comma-separated list. A sketch of how these variables might be consumed follows the table below.
Key | Value |
---|---|
arole | role name to assume |
attributes | (comma-separated list) e.g. OwnerId,InstanceId,InstanceType |
tags | (comma-separated list) e.g. Name,Project,Release,Environment,CostCentre |
aws_accounts | (comma-separated list of 12-digit account IDs) e.g. 111111111111,222222222222,333333333333 |
bucket_name | name of the bucket, e.g. reporting_test_bucket |
output_path | folder within the bucket where the report will be held, e.g. ec2_reports |
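As a rough sketch (not the deployed code), a handler consuming these environment variables might look something like the following: it assumes the role in each account, gathers the requested attributes and tags, and uploads a single CSV to the bucket. The helper and session names are illustrative, and for brevity it only queries the Lambda's own region:

```python
import csv
import io
import os
from datetime import datetime

import boto3

# Comma-separated Lambda environment variables described in the table above
ROLE_NAME = os.environ["arole"]
ATTRIBUTES = os.environ["attributes"].split(",")
TAGS = os.environ["tags"].split(",")
ACCOUNTS = os.environ["aws_accounts"].split(",")
BUCKET = os.environ["bucket_name"]
OUTPUT_PATH = os.environ["output_path"]


def ec2_client_for(account_id):
    """Assume the reporting role in the target account and return an EC2 client."""
    creds = boto3.client("sts").assume_role(
        RoleArn=f"arn:aws:iam::{account_id}:role/{ROLE_NAME}",
        RoleSessionName="ec2-report",
    )["Credentials"]
    return boto3.client(
        "ec2",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )


def lambda_handler(event, context):
    rows = []
    for account_id in ACCOUNTS:
        ec2 = ec2_client_for(account_id.strip())
        for page in ec2.get_paginator("describe_instances").paginate():
            for reservation in page["Reservations"]:
                for instance in reservation["Instances"]:
                    tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
                    # Some attributes (e.g. OwnerId) sit on the reservation
                    # rather than the instance, so fall back to the reservation
                    row = {
                        attr: instance.get(attr, reservation.get(attr, ""))
                        for attr in ATTRIBUTES
                    }
                    row.update({tag: tags.get(tag, "") for tag in TAGS})
                    rows.append(row)

    # Build a single CSV in memory and upload it to the reporting bucket
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=ATTRIBUTES + TAGS)
    writer.writeheader()
    writer.writerows(rows)

    key = f"{OUTPUT_PATH}/ec2_report_{datetime.utcnow():%Y-%m-%d}.csv"
    boto3.client("s3").put_object(Bucket=BUCKET, Key=key, Body=buffer.getvalue())
    return {"instances": len(rows), "report": f"s3://{BUCKET}/{key}"}
```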
Please use the following steps to define a schedule trigger to generate the reports (a programmatic alternative is sketched after the steps):
- On the AWS Console, navigate to CloudWatch, then select Events > Rules
- Select the Create Rule button
- Select the Schedule radio button and define your schedule. We used the cron expression `00 03 ? * * *`, which executes the code at 3 AM every day of the week
- Select the add target button
- Choose Lambda function and then select your function from the drop down list
- Select the Configure details button
- Give the rule a meaningful name and description
- Select the Create rule button
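If you prefer to create the schedule programmatically rather than through the console, a boto3 sketch along the following lines should give the same result (the rule name, statement ID, region and function ARN are placeholders):

```python
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

# Placeholder ARN for the deployed reporting function
FUNCTION_ARN = "arn:aws:lambda:eu-west-1:111111111111:function:ec2_report"

# Create (or update) the scheduled rule: 3 AM every day
rule_arn = events.put_rule(
    Name="ec2-report-daily",
    ScheduleExpression="cron(00 03 ? * * *)",
    State="ENABLED",
    Description="Run the EC2 report Lambda at 3 AM every day",
)["RuleArn"]

# Allow CloudWatch Events to invoke the function
lambda_client.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="ec2-report-daily-event",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule_arn,
)

# Point the rule at the Lambda function
events.put_targets(
    Rule="ec2-report-daily",
    Targets=[{"Id": "ec2-report-lambda", "Arn": FUNCTION_ARN}],
)
```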
Unless you have any contractual or regulatory requirements to retain a certain number of reports, you could add a lifecycle rule to the S3 bucket to remove objects older than a given number of days.
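As an illustration (the bucket name, prefix and retention period are placeholders), such a rule could be applied with boto3:

```python
import boto3

# Expire report objects under the output prefix after 90 days
boto3.client("s3").put_bucket_lifecycle_configuration(
    Bucket="reporting_test_bucket",  # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "expire-old-ec2-reports",
                "Filter": {"Prefix": "ec2_reports/"},
                "Status": "Enabled",
                "Expiration": {"Days": 90},
            }
        ]
    },
)
```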
None
Dave Hart