Organizations with multiple Amazon Web Services (AWS) accounts face significant challenges in maintaining consistent backup security policies. Without centralized visibility, security teams struggle to verify that every backup vault has appropriate immutability controls enabled, leaving potential security gaps. According to AWS security best practices, immutable backups are a fundamental defense against ransomware events, which have increased by over 300% in recent years.
Protecting backup data is a critical component of organizational security strategy. AWS Backup Vault Lock provides robust immutability controls that prevent backup deletion—even by users with administrative privileges—offering essential protection against ransomware events and insider threats. For organizations managing multiple AWS accounts, maintaining consistent vault lock policies across the enterprise is crucial for promoting compliance and establishing a strong security posture.
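As background, a vault lock is applied to an existing vault with a single API call. The following sketch shows the parameters involved; the vault name and retention values are illustrative examples, not part of this solution, and the parameter-building helper is our own convenience function:

```python
def vault_lock_params(vault_name, min_days, max_days, changeable_for_days):
    """Build the request parameters for put_backup_vault_lock_configuration."""
    return {
        "BackupVaultName": vault_name,
        "MinRetentionDays": min_days,
        "MaxRetentionDays": max_days,
        # After this grace period elapses, the lock can no longer be changed or removed
        "ChangeableForDays": changeable_for_days,
    }

def enable_vault_lock(vault_name, min_days=7, max_days=365, changeable_for_days=3):
    """Apply a vault lock to an existing backup vault (retention values are examples)."""
    import boto3  # deferred so the parameter helper is usable without the SDK installed
    backup = boto3.client("backup")
    backup.put_backup_vault_lock_configuration(
        **vault_lock_params(vault_name, min_days, max_days, changeable_for_days)
    )
```

Choose `ChangeableForDays` carefully: once the grace period passes, even the account root user can't shorten or remove the lock.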
In this post, we show how to implement automated reporting for AWS Backup Vault Lock status across accounts in your organization. The solution uses AWS Organizations, the AWS Command Line Interface (AWS CLI), and cross-account access to create vault lock compliance reports.
Solution overview
This solution uses Organizations, AWS Identity and Access Management (IAM), and cross-account access patterns to create a unified view of your backup vault lock status across your enterprise. The implementation creates a secure, scalable framework that collects and analyzes vault lock information from the accounts and AWS Regions in your organization.
The architecture consists of three primary components:
- Central management account configuration – The solution establishes an IAM administrative role in your central account that orchestrates the reporting process. This role has carefully scoped permissions to access organization data and assume roles in member accounts, maintaining the principle of least privilege while enabling cross-account functionality.
- Member account access framework – Each member account in your organization is configured with a specialized IAM role that grants the central account limited permissions to query backup vault information. This role-based approach provides secure, controlled access without requiring permanent credentials or excessive permissions.
- Automated reporting – A Python-based reporting engine collects and analyzes vault lock information from your organization’s accounts and Regions. It uses multi-threading for efficient processing and generates reports in multiple formats for easy consumption.
The following diagram shows the solution architecture.

Prerequisites
To deploy this solution, you must have the following prerequisites:
- An AWS Organizations setup with a central management account and member accounts
- AWS credentials configured for your central account (for example, through the AWS Command Line Interface)
- Python 3 with the boto3 and botocore libraries installed
Configure IAM roles for cross-account access
To configure IAM roles for cross-account access, complete the following high-level steps:
- Create an IAM role in your central account.
- Create an IAM role in each member account.
Create an IAM role in your central account
Create an administrative role in your central account that will orchestrate the vault lock reporting process:
- On the IAM console, choose Roles in the navigation pane, then choose Create role.
- For Trusted entity type, select AWS service.
- For Use case, choose Lambda.

- Name the role (for example, AWS_Backup_Admin_Role) and attach the following permissions policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "organizations:ListAccounts",
                "sts:AssumeRole",
                "ec2:DescribeRegions"
            ],
            "Resource": "*"
        }
    ]
}
Create an IAM role in each member account
Create a role in each member account that the central account can assume:
- On the IAM console, choose Roles in the navigation pane, then choose Create role.
- For Trusted entity type, select AWS account.
- For An AWS account, select Another AWS account.
- Enter your central account ID.

- Name the role (for example, OrganizationAccountAccessRole) and attach the following trust policy, replacing 123456789 with your central account ID and updating the role names in the principal ARNs to match the roles in your central account that should assume this role (such as the administrative role you created earlier):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::123456789:role/Lambda-AWS-Backup-Org",
                    "arn:aws:iam::123456789:role/Admin"
                ]
            },
            "Action": "sts:AssumeRole",
            "Condition": {}
        }
    ]
}
- Attach the following permissions policy to the role:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "backup:ListBackupVaults",
                "backup:DescribeBackupVault"
            ],
            "Resource": "*"
        }
    ]
}
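Before running the full report, you can verify the cross-account configuration with a short boto3 sketch that assumes the member role and lists vault names in one Region. The account ID shown is a placeholder, and the helper function names are our own:

```python
def member_role_arn(account_id, role_name="OrganizationAccountAccessRole"):
    """Build the ARN of the member-account role the central account assumes."""
    return f"arn:aws:iam::{account_id}:role/{role_name}"

def smoke_test_member_access(account_id, region="us-east-1"):
    """Assume the member role and list backup vault names in one Region."""
    import boto3  # deferred so the ARN helper stays usable without the SDK installed
    sts = boto3.client("sts")
    creds = sts.assume_role(
        RoleArn=member_role_arn(account_id),
        RoleSessionName="VaultLockSmokeTest",
    )["Credentials"]
    backup = boto3.client(
        "backup",
        region_name=region,
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    return [v["BackupVaultName"] for v in backup.list_backup_vaults()["BackupVaultList"]]
```

If this call fails with an access-denied error, recheck the trust policy on the member role and the permissions on your central role before moving on.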
Implement the Python script for cross-account vault lock reporting
To implement the Python script for cross-account vault lock reporting, complete the following high-level steps:
- Create the monitoring script.
- Run the monitoring script.
- Review the generated reports.
Create the monitoring script
To create the monitoring script, complete the following steps:
- Create a new Python file named backup_vault_monitor.py.
- Copy the following sample Python code into this file:
import boto3
import json
import time
import os
import csv
from botocore.config import Config
from concurrent.futures import ThreadPoolExecutor
from botocore.exceptions import ClientError
from datetime import datetime

# Set SDK configuration
os.environ['AWS_SDK_LOAD_CONFIG'] = '1'

def get_all_regions():
    """Get list of all AWS regions"""
    ec2_client = boto3.client('ec2')
    try:
        response = ec2_client.describe_regions()
        return [region['RegionName'] for region in response['Regions']]
    except ClientError as e:
        print(f"Error getting regions: {e}")
        return ['us-east-1']

def get_linked_accounts():
    """Fetch all linked accounts from AWS Organizations"""
    org_client = boto3.client('organizations')
    accounts = []
    try:
        paginator = org_client.get_paginator('list_accounts')
        for page in paginator.paginate():
            accounts.extend(page['Accounts'])
        return accounts
    except ClientError as e:
        print(f"Error fetching accounts: {e}")
        return []

def assume_role(account_id, max_retries=3, region_name='us-east-1'):
    """Assume role in the target account with retry logic"""
    config = Config(
        signature_version='v4',
        retries=dict(max_attempts=3)
    )
    sts_client = boto3.client('sts',
                              region_name=region_name,
                              config=config)
    for attempt in range(max_retries):
        try:
            response = sts_client.assume_role(
                RoleArn=f'arn:aws:iam::{account_id}:role/OrganizationAccountAccessRole',
                RoleSessionName=f'BackupVaultCheck-{attempt}',
                DurationSeconds=3600
            )
            return response['Credentials']
        except ClientError as e:
            if attempt == max_retries - 1:
                print(f"Error assuming role in account {account_id} after {max_retries} attempts: {e}")
                return None
            print(f"Retry {attempt + 1} for assuming role in account {account_id}")
            time.sleep(2)
    return None

def assume_role_with_regional_sts(account_id, region, max_retries=3):
    """Assume role using regional STS endpoint"""
    config = Config(
        signature_version='v4',
        retries=dict(max_attempts=3)
    )
    sts_client = boto3.client('sts',
                              region_name=region,
                              config=config,
                              endpoint_url=f'https://sts.{region}.amazonaws.com')
    for attempt in range(max_retries):
        try:
            response = sts_client.assume_role(
                RoleArn=f'arn:aws:iam::{account_id}:role/OrganizationAccountAccessRole',
                RoleSessionName=f'BackupVaultCheck-{region}-{attempt}',
                DurationSeconds=3600
            )
            print(f"Successfully assumed role in region {region} for account {account_id}")
            return response['Credentials']
        except ClientError as e:
            if attempt == max_retries - 1:
                print(f"Error assuming role in account {account_id} for region {region} after {max_retries} attempts: {e}")
                return None
            print(f"Retry {attempt + 1} for assuming role in account {account_id} for region {region}")
            time.sleep(2)
    return None

def get_backup_vaults_for_region(backup_client, account_id, region):
    """Get backup vaults for a specific region"""
    vault_details = []
    try:
        paginator = backup_client.get_paginator('list_backup_vaults')
        for page in paginator.paginate():
            for vault in page['BackupVaultList']:
                try:
                    vault_info = backup_client.describe_backup_vault(
                        BackupVaultName=vault['BackupVaultName']
                    )
                    vault_detail = {
                        'AccountId': account_id,
                        'Region': region,
                        'BackupVaultName': vault_info['BackupVaultName'],
                        'BackupVaultArn': vault_info['BackupVaultArn'],
                        'CreationDate': vault_info['CreationDate'].isoformat(),
                        'NumberOfRecoveryPoints': vault_info.get('NumberOfRecoveryPoints', 0),
                        'Locked': vault_info.get('Locked', False),
                        'MinRetentionDays': vault_info.get('MinRetentionDays'),
                        'MaxRetentionDays': vault_info.get('MaxRetentionDays')
                    }
                    vault_details.append(vault_detail)
                except ClientError as e:
                    print(f"Error getting vault details for {vault['BackupVaultName']} in account {account_id}, region {region}: {e}")
        return vault_details
    except ClientError as e:
        print(f"Error listing vaults in account {account_id}, region {region}: {e}")
        return []

def get_backup_vaults(credentials, account_id):
    """Get all backup vaults and their lock status for an account across all regions"""
    if not credentials:
        return []
    regions = get_all_regions()
    vault_details = []
    # Opt-in Regions that require a regional STS token
    targeted_regions = ['af-south-1', 'ap-southeast-5']
    for region in regions:
        print(f"Processing region {region} for account {account_id}")
        try:
            if region in targeted_regions:
                regional_credentials = assume_role_with_regional_sts(account_id, region)
                if not regional_credentials:
                    continue
                backup_client = boto3.client(
                    'backup',
                    region_name=region,
                    aws_access_key_id=regional_credentials['AccessKeyId'],
                    aws_secret_access_key=regional_credentials['SecretAccessKey'],
                    aws_session_token=regional_credentials['SessionToken'],
                    config=Config(
                        signature_version='v4',
                        retries=dict(max_attempts=3)
                    )
                )
            else:
                backup_client = boto3.client(
                    'backup',
                    region_name=region,
                    aws_access_key_id=credentials['AccessKeyId'],
                    aws_secret_access_key=credentials['SecretAccessKey'],
                    aws_session_token=credentials['SessionToken'],
                    config=Config(
                        signature_version='v4',
                        retries=dict(max_attempts=3)
                    )
                )
            max_retries = 3
            success = False
            for attempt in range(max_retries):
                try:
                    backup_client.list_backup_vaults(MaxResults=1)
                    success = True
                    break
                except ClientError as e:
                    if 'UnrecognizedClientException' in str(e) and attempt < max_retries - 1:
                        time.sleep(2)
                        continue
                    if attempt == max_retries - 1:
                        print(f"Error testing backup client in region {region} after {max_retries} attempts: {e}")
                    break
                except Exception as e:
                    print(f"Unexpected error testing backup client in region {region}: {e}")
                    break
            if success:
                region_vaults = get_backup_vaults_for_region(backup_client, account_id, region)
                vault_details.extend(region_vaults)
        except ClientError as e:
            error_code = getattr(e, 'response', {}).get('Error', {}).get('Code', '')
            error_message = getattr(e, 'response', {}).get('Error', {}).get('Message', '')
            print(f"Error processing region {region} in account {account_id}. Code: {error_code}, Message: {error_message}")
        except Exception as e:
            print(f"Unexpected error processing region {region} in account {account_id}: {e}")
    return vault_details

def process_account(account):
    """Process a single account"""
    account_id = account['Id']
    account_name = account['Name']
    print(f"Processing account: {account_name} ({account_id})")
    credentials = assume_role(account_id, region_name='us-east-1')
    if credentials:
        return get_backup_vaults(credentials, account_id)
    return []

def format_table(vault_details):
    """Format vault details into a table string"""
    headers = {
        'Account ID': 15,
        'Region': 15,
        'Vault Name': 30,
        'Locked': 10,
        'Min Days': 10,
        'Max Days': 10
    }
    header_line = ''
    separator_line = ''
    for header, width in headers.items():
        header_line += f"{header:<{width}}"
        separator_line += '-' * width
    rows = []
    for vault in vault_details:
        row = (
            f"{vault['AccountId']:<{headers['Account ID']}}"
            f"{vault['Region']:<{headers['Region']}}"
            f"{vault['BackupVaultName']:<{headers['Vault Name']}}"
            f"{str(vault['Locked']):<{headers['Locked']}}"
            f"{str(vault.get('MinRetentionDays', 'N/A')):<{headers['Min Days']}}"
            f"{str(vault.get('MaxRetentionDays', 'N/A')):<{headers['Max Days']}}"
        )
        rows.append(row)
    table = f"\nBackup Vault Summary:\n{'='*90}\n{header_line}\n{separator_line}\n"
    table += '\n'.join(rows)
    return table

def create_csv(vault_details):
    """Convert vault details to CSV format"""
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    csv_filename = f'backup_vaults_inventory_{timestamp}.csv'
    headers = [
        'Account ID',
        'Region',
        'Vault Name',
        'Vault ARN',
        'Creation Date',
        'Number of Recovery Points',
        'Locked',
        'Min Retention Days',
        'Max Retention Days'
    ]
    try:
        with open(csv_filename, 'w', newline='') as csvfile:
            writer = csv.DictWriter(csvfile, fieldnames=headers)
            writer.writeheader()
            for vault in vault_details:
                writer.writerow({
                    'Account ID': vault['AccountId'],
                    'Region': vault['Region'],
                    'Vault Name': vault['BackupVaultName'],
                    'Vault ARN': vault['BackupVaultArn'],
                    'Creation Date': vault['CreationDate'],
                    'Number of Recovery Points': vault['NumberOfRecoveryPoints'],
                    'Locked': vault['Locked'],
                    'Min Retention Days': vault.get('MinRetentionDays', 'N/A'),
                    'Max Retention Days': vault.get('MaxRetentionDays', 'N/A')
                })
        return csv_filename
    except Exception as e:
        print(f"Error creating CSV file: {e}")
        return None

def save_to_file(formatted_output, all_vault_details):
    """Save output to files"""
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    # Save text report
    report_filename = f'backup_vaults_report_{timestamp}.txt'
    with open(report_filename, 'w') as f:
        f.write(formatted_output)
    # Save JSON inventory
    json_filename = f'backup_vaults_inventory_{timestamp}.json'
    with open(json_filename, 'w') as f:
        json.dump(all_vault_details, f, indent=2, default=str)
    # Save CSV inventory
    csv_filename = create_csv(all_vault_details)
    return report_filename, json_filename, csv_filename

def main():
    try:
        print("Fetching linked accounts...")
        accounts = get_linked_accounts()
        if not accounts:
            print("No accounts found or error fetching accounts")
            return
        print(f"Found {len(accounts)} accounts")
        all_vault_details = []
        with ThreadPoolExecutor(max_workers=5) as executor:
            results = list(executor.map(process_account, accounts))
        for result in results:
            all_vault_details.extend(result)
        all_vault_details.sort(key=lambda x: (x['AccountId'], x['Region'], x['BackupVaultName']))
        formatted_table = format_table(all_vault_details)
        total_vaults = len(all_vault_details)
        locked_vaults = len([v for v in all_vault_details if v['Locked']])
        unlocked_vaults = len([v for v in all_vault_details if not v['Locked']])
        summary = f"""
Summary Statistics:
{'-'*20}
Total vaults: {total_vaults}
Locked vaults: {locked_vaults}
Unlocked vaults: {unlocked_vaults}
"""
        formatted_output = formatted_table + summary
        print(formatted_output)
        report_file, json_file, csv_file = save_to_file(formatted_output, all_vault_details)
        print(f"\nDetailed report saved to: {report_file}")
        print(f"JSON inventory saved to: {json_file}")
        if csv_file:
            print(f"CSV inventory saved to: {csv_file}")
    except Exception as e:
        print(f"Error in main execution: {str(e)}")

if __name__ == "__main__":
    main()
- Save the file.
The script performs several key functions:
- Discovers the accounts in your organization through AWS Organizations
- Assumes the role in each member account
- Queries Regions for backup vaults
- Collects detailed information about each vault’s lock status
- Generates reports
Run the monitoring script
To run the monitoring script, follow these steps:
- Open a terminal or command prompt.
- Navigate to the directory containing the script.
- Confirm that you have AWS credentials configured for your central account.
- Run the script:
python backup_vault_monitor.py
During execution, the script will display progress information as it processes each account and Region, as shown in the following screenshot.

Review the generated reports
The script generates three output files:
- A text report with summary statistics
- A JSON inventory for programmatic analysis
- A CSV file for import into spreadsheets or databases
The reports include the following information:
- Account ID and name
- Region
- Backup vault name and Amazon Resource Name (ARN)
- Lock status (locked or unlocked)
- Minimum and maximum retention periods
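Because the JSON inventory is machine-readable, you can post-process it without rerunning the scan. The following stdlib-only sketch loads an inventory file and filters out the unlocked vaults; the filename in the comment is an example of the script's timestamped naming, and the helper names are our own:

```python
import json

def unlocked_vaults(vaults):
    """Return the vault records whose vault lock is not enabled."""
    return [v for v in vaults if not v["Locked"]]

def load_inventory(path):
    """Load a JSON inventory file produced by the monitoring script."""
    with open(path) as f:
        return json.load(f)

# Example usage (substitute the timestamped filename the script printed):
# for v in unlocked_vaults(load_inventory("backup_vaults_inventory_20240101_120000.json")):
#     print(v["AccountId"], v["Region"], v["BackupVaultName"])
```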
For example, the summary might look like the following screenshots.

The following screenshot is an example of the output CSV file.

Customize the solution
Because organizations have different requirements and environments, you might want to modify the script to better suit your needs. The following sections explain some key areas you can customize.
Add alert thresholds
You can enhance the script with custom alert thresholds to proactively monitor compliance levels across your organization’s backup vaults. For example, you can configure alerts when the percentage of unlocked vaults exceeds a defined threshold or when specific accounts fall below your organization’s compliance requirements:
def send_alert(message):
    """Send alert for compliance issues"""
    print(f"\n⚠️ ALERT: {message}")
    # You can add additional alert mechanisms here (e.g., SNS, email, Slack, etc.)
    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    # Save alert to file
    with open('compliance_alerts.log', 'a') as f:
        f.write(f"{timestamp} - {message}\n")

def analyze_compliance(vault_details):
    """Analyze backup vault compliance"""
    total_vaults = len(vault_details)
    if total_vaults == 0:
        print("\nNo vaults found to analyze")
        return
    unlocked_vaults = len([v for v in vault_details if not v['Locked']])
    # Configure your compliance thresholds
    COMPLIANCE_THRESHOLD = 0.95  # 95% of vaults should be locked
    locked_percentage = (total_vaults - unlocked_vaults) / total_vaults
    print(f"\nCompliance Analysis:")
    print(f"{'='*20}")
    print(f"Total Vaults: {total_vaults}")
    print(f"Locked Vaults: {total_vaults - unlocked_vaults}")
    print(f"Unlocked Vaults: {unlocked_vaults}")
    print(f"Locked Percentage: {locked_percentage:.2%}")
    print(f"Compliance Threshold: {COMPLIANCE_THRESHOLD:.2%}")
    if locked_percentage < COMPLIANCE_THRESHOLD:
        send_alert(f"Compliance threshold not met. Only {locked_percentage:.2%} of vaults are locked (threshold: {COMPLIANCE_THRESHOLD:.2%})")
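The comment in send_alert mentions SNS as one alert channel. A minimal sketch of publishing the alert to an Amazon SNS topic follows; the topic ARN is a placeholder you would replace with your own, and the subject-line helper is illustrative:

```python
def alert_subject(locked_percentage, threshold):
    """Build a short subject line for the compliance alert."""
    return (f"Backup vault lock compliance below threshold: "
            f"{locked_percentage:.2%} locked (target {threshold:.2%})")

def send_sns_alert(message, locked_percentage, threshold,
                   topic_arn="arn:aws:sns:us-east-1:123456789012:backup-compliance-alerts"):
    """Publish the alert to an SNS topic (topic ARN is a placeholder)."""
    import boto3  # deferred so the subject helper is usable without the SDK installed
    sns = boto3.client("sns")
    sns.publish(
        TopicArn=topic_arn,
        Subject=alert_subject(locked_percentage, threshold)[:100],  # SNS subject length limit
        Message=message,
    )
```

To wire this in, call send_sns_alert from inside send_alert alongside the existing print and log-file logic.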
Regional customization
You can customize which Regions to scan based on your organization’s geographic footprint and compliance requirements. You can include specific Regions or exclude Regions where you don’t operate, optimizing the script’s performance and relevance to your environment. Replace the variables in the following code with your own information:
def get_all_regions():
    # Define your region preferences
    INCLUDED_REGIONS = ['us-east-1', 'eu-west-1', 'ap-southeast-1']  # Only scan these regions
    EXCLUDED_REGIONS = ['ap-south-2', 'me-central-1']  # Skip these regions
    # Get all available AWS regions
    ec2_client = boto3.client('ec2')
    all_regions = [r['RegionName'] for r in ec2_client.describe_regions()['Regions']]
    if INCLUDED_REGIONS:
        # Only process specifically included regions
        regions = [r for r in all_regions if r in INCLUDED_REGIONS]
    else:
        # Process all regions except excluded ones
        regions = [r for r in all_regions if r not in EXCLUDED_REGIONS]
    return regions
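The include/exclude logic can also be factored into a pure helper, which makes the Region selection unit-testable without any AWS calls. This is a sketch and the function name is our own:

```python
def select_regions(all_regions, included=None, excluded=None):
    """Apply the same include/exclude rules as get_all_regions to any Region list."""
    if included:
        # An include list takes precedence: scan only these Regions
        return [r for r in all_regions if r in included]
    excluded = excluded or []
    # Otherwise scan everything except the excluded Regions
    return [r for r in all_regions if r not in excluded]
```

get_all_regions would then fetch the Region list from EC2 and delegate the filtering to this helper.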
Clean up
This solution doesn’t create persistent AWS resources. To clean up, complete the following steps:
- Remove the IAM roles if no longer needed.
- Delete the generated report files.
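Because the script names its outputs with a common prefix, the generated report files can be removed with a short stdlib snippet. The patterns below match the timestamped filenames backup_vault_monitor.py creates:

```python
import glob
import os

# Patterns match the timestamped files created by backup_vault_monitor.py
REPORT_PATTERNS = [
    "backup_vaults_report_*.txt",
    "backup_vaults_inventory_*.json",
    "backup_vaults_inventory_*.csv",
]

def cleanup_reports(directory="."):
    """Delete generated report files and return the paths removed."""
    removed = []
    for pattern in REPORT_PATTERNS:
        for path in glob.glob(os.path.join(directory, pattern)):
            os.remove(path)
            removed.append(path)
    return removed
```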
Conclusion
In this post, you learned how to implement centralized monitoring of AWS Backup Vault Lock status across your organization. This solution helps you maintain consistent backup protection policies and quickly identify gaps in your backup security posture.
Implementing vault locks is a critical step in preventing unauthorized or accidental deletion of your backup data, and this monitoring solution helps promote consistent application of these security controls across your AWS environment.
For organizations seeking to enhance their backup security strategy further, consider exploring related AWS capabilities such as AWS Backup Audit Manager, which provides specific compliance controls, including having resources in a backup plan with AWS Backup Vault Lock and having resources inside logically air-gapped vaults. With these features, you can systematically verify backup compliance and implement additional protection layers against sophisticated threats, including ransomware and insider events.
To learn more about AWS Backup and vault locks, refer to the AWS Backup Developer Guide. You can also explore the related posts Streamline and automate compliance monitoring using AWS Backup Audit Manager and Building cyber resiliency with AWS Backup logically air-gapped vault.
About the authors
Somnath Chatterjee is an accomplished Senior Technical Account Manager at Amazon Web Services (AWS) who is dedicated to guiding customers in crafting and implementing their cloud solutions on AWS. He collaborates strategically with customers to help them run cost-optimized and resilient workloads in the cloud. Beyond his primary role, Somnath holds specialization in the compute technical field community. He holds the SAP on AWS Specialty certification and is an Amazon EFS subject matter expert. With over 14 years of experience in the information technology industry, he excels in cloud architecture and helps customers achieve their desired outcomes on AWS.
Sumit Bhardwaj is a Sr. Technical Account Manager at Amazon Web Services (AWS) with over 10 years of industry experience. He is a technology enthusiast and enjoys tackling complex challenges and building flawless cloud operations for AWS customers. Beyond his professional pursuits, Sumit enjoys reading books during his free time. He’s also an avid traveler, always eager to explore new places.
Manish Kaushik is a Cloud Support Engineer at Amazon Web Services (AWS), based in India, with a strong passion for cloud technologies. He specializes in providing expert support for AWS storage services, efficiently helping customers troubleshoot and resolve their issues. As a subject matter expert in AWS DataSync and Amazon S3, with over 5 years of experience, Manish plays a crucial role in empowering support engineering teams to deliver exceptional customer service.