Monday, October 6, 2025

S3 cross account replication

I was moving AWS resources from one account into separate staging and production accounts. One of the steps was to migrate S3 buckets. The solution was cross-account replication. Because S3 replication copies only newly written objects, we also have to create an S3 Batch Operations job to copy the existing objects.

S3 cross account replication and Batch Operation

The steps are as follows:

  • enable S3 bucket versioning on both buckets,
  • in the source account, prepare an IAM policy as part of an IAM role to be used by S3 replication:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:ListBucket",
                "s3:GetReplicationConfiguration",
                "s3:GetObjectVersionForReplication",
                "s3:GetObjectVersionAcl",
                "s3:GetObjectVersionTagging",
                "s3:GetObjectRetention",
                "s3:GetObjectLegalHold"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::SOURCE_BUCKET",
                "arn:aws:s3:::SOURCE_BUCKET/*",
                "arn:aws:s3:::TARGET_BUCKET",
                "arn:aws:s3:::TARGET_BUCKET/*"
            ]
        },
        {
            "Action": [
                "s3:ReplicateObject",
                "s3:ReplicateDelete",
                "s3:ReplicateTags",
                "s3:ObjectOwnerOverrideToBucketOwner"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::SOURCE_BUCKET/*",
                "arn:aws:s3:::TARGET_BUCKET/*"
            ]
        }
    ]
}
  • in the source account, create an IAM role that includes the above policy (trusted entity type = AWS service, use case = S3),
  • in the source account, prepare an IAM role for the S3 Batch Operations job (trusted entity type = AWS service, use case = S3 Batch Operations) with the following policy:
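For reference, the trust policy on the replication role has to let the S3 service assume it; this is the shape the console generates when you pick the S3 use case:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "s3.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
```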
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "GetSourceBucketConfiguration",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation",
                "s3:GetBucketAcl",
                "s3:GetReplicationConfiguration",
                "s3:GetObjectVersionForReplication",
                "s3:GetObjectVersionAcl",
                "s3:GetObjectVersionTagging",
                "s3:PutInventoryConfiguration",
                "s3:GetInventoryConfiguration",
                "s3:PutObject",
                "s3:GetObject",
                "s3:InitiateReplication",
                "s3:AbortMultipartUpload"
            ],
            "Resource": [
                "arn:aws:s3:::SOURCE_BUCKET",
                "arn:aws:s3:::SOURCE_BUCKET/*"
            ]
        },
        {
            "Sid": "ReplicateToDestinationBuckets",
            "Effect": "Allow",
            "Action": [
                "s3:List*",
                "s3:*Object",
                "s3:ReplicateObject",
                "s3:ReplicateDelete",
                "s3:ReplicateTags"
            ],
            "Resource": [
                "arn:aws:s3:::TARGET_BUCKET",
                "arn:aws:s3:::TARGET_BUCKET/*"
            ]
        },
        {
            "Sid": "PermissionToOverrideBucketOwner",
            "Effect": "Allow",
            "Action": [
                "s3:ObjectOwnerOverrideToBucketOwner"
            ],
            "Resource": [
                "arn:aws:s3:::TARGET_BUCKET",
                "arn:aws:s3:::TARGET_BUCKET/*"
            ]
        }
    ]
}
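Similarly, the Batch Operations role needs a trust policy that lets the S3 Batch Operations service assume it (again, the shape the console generates for that use case):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "batchoperations.s3.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
```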
  • in the target account, update the S3 bucket policy:
{
    "Version": "2012-10-17" ,
    "Id": "",
    "Statement": [
        {
            "Sid": "Set-permissions-for-objects",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::SOURCE_ACCOUNT_NUMBER:role/service-role/REPLICATION_IAM_ROLE_NAME"
            },
            "Action": [
                "s3:ReplicateObject",
                "s3:ReplicateDelete"
            ],
            "Resource": "arn:aws:s3:::TARGET_BUCKET/*"
        },
        {
            "Sid": "Set permissions on bucket",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::SOURCE_ACCOUNT_NUMBER:role/service-role/REPLICATION_IAM_ROLE_NAME"
            },
            "Action": [
                "s3:GetBucketVersioning",
                "s3:PutBucketVersioning"
            ],
            "Resource": "arn:aws:s3:::TARGET_BUCKET"
        },
        {
            "Sid": "Permissions on objects and buckets",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::SOURCE_ACCOUNT_NUMBER:role/BATCH_OPERATIONS_IAM_ROLE_NAME"
            },
            "Action": [
                "s3:List*",
                "s3:GetBucketVersioning",
                "s3:PutBucketVersioning",
                "s3:ReplicateDelete",
                "s3:ReplicateObject"
            ],
            "Resource": [
                "arn:aws:s3:::TARGET_BUCKET",
                "arn:aws:s3:::TARGET_BUCKET/*"
            ]
        },
        {
            "Sid": "1",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::SOURCE_ACCOUNT_NUMBER:role/service-role/REPLICATION_IAM_ROLE_NAME"
            },
            "Action": [
                "s3:ObjectOwnerOverrideToBucketOwner"
            ],
            "Resource": "arn:aws:s3:::TARGET_BUCKET/*"
        },
        {
            "Sid": "Permission to override bucket owner",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::SOURCE_ACCOUNT_NUMBER:role/BATCH_OPERATIONS_IAM_ROLE_NAME"
            },
            "Action": "s3:ObjectOwnerOverrideToBucketOwner",
            "Resource": "arn:aws:s3:::TARGET_BUCKET/*"
        }
    ]
}
  • go to your source S3 bucket, open the "Management" tab and click "Create replication rule":
    • give it a name,
    • status = "Enabled",
    • rule scope = "Apply to all objects in the bucket",
    • choose your destination bucket (check "Change object ownership to destination bucket owner"),
    • choose your S3 replication IAM role,
    • check "Change the storage class for the replicated objects" with the Standard storage class,
    • check "Delete marker replication" as an additional replication option,
  • in the source account go to "S3", open "Batch Operations" and click "Create job":
    • object list = "Generate an object list based on a replication configuration" (it will use the S3 replication rule created previously),
    • choose your source S3 bucket,
    • click "Next",
    • operation = "Replicate",
    • click "Next",
    • give the job a name,
    • uncheck "Generate completion report",
    • choose your S3 Batch Operations IAM role,
    • click "Next",
    • review the settings and click "Submit".
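The replication rule configured in the console above corresponds to a replication configuration document like the one below (the rule ID and the placeholders are mine); the same configuration could be applied from the CLI with `aws s3api put-bucket-replication`:

```json
{
    "Role": "arn:aws:iam::SOURCE_ACCOUNT_NUMBER:role/service-role/REPLICATION_IAM_ROLE_NAME",
    "Rules": [
        {
            "ID": "replicate-everything",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},
            "DeleteMarkerReplication": { "Status": "Enabled" },
            "Destination": {
                "Bucket": "arn:aws:s3:::TARGET_BUCKET",
                "Account": "TARGET_ACCOUNT_NUMBER",
                "StorageClass": "STANDARD",
                "AccessControlTranslation": { "Owner": "Destination" }
            }
        }
    ]
}
```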

Monday, July 14, 2025

Include Terraform dependency lock file

Why? Because at the beginning of initialization Terraform saves module and provider checksums to it. Thanks to this you can track whether anything changed in the versions you use.
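For illustration, an entry in .terraform.lock.hcl looks roughly like this (the version, constraint, and hashes below are placeholders, not real values):

```hcl
provider "registry.terraform.io/hashicorp/aws" {
  version     = "5.0.0"
  constraints = "~> 5.0"
  hashes = [
    "h1:PLACEHOLDER",
    "zh:PLACEHOLDER",
  ]
}
```

Commit this file to version control; `terraform init` then verifies downloaded providers against the recorded hashes.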

Source: https://www.hashicorp.com/en/blog/terraform-security-5-foundational-practices

Thursday, February 27, 2025

A cost optimized AWS environment

Cost savings:

  • Savings Plans,
  • Reserved Instances,
  • change your default payment method to avoid currency conversion,
  • Spot Instances (for example in a development environment),
  • Data Lifecycle Manager for EBS (automate deletion of unneeded snapshots),
  • S3:
    • a lifecycle policy for a bucket (move your data into a cheaper storage class),
  • use VPC endpoints (AWS charges for outbound data transfer),
  • use Graviton instance types,
  • use Lambda to switch off your instances (for example EC2, RDS) outside working hours in your development environments,
  • choose the right region, because a resource can be cheaper in a different region,
  • Parameter Store instead of Secrets Manager if you don't need versioning or rotation,
  • ElastiCache for Redis:
    • consider using ElastiCache for Valkey,
  • CloudWatch:
    • logs retention,
  • NAT Gateway:
    • data processing is billed per GB - route S3 and DynamoDB traffic through gateway VPC endpoints so it bypasses the NAT Gateway,
  • Route 53:
    • check your records' TTLs - the higher the TTL, the fewer queries resolvers make and the less you pay.
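
The "switch off instances outside working hours" item above can be sketched as a small Lambda handler. The working-hours window (08:00-18:00 UTC) and the Environment=dev tag filter below are my assumptions; the boto3 calls are the standard EC2 API:

```python
from datetime import datetime, time, timezone

# Assumed dev working hours, in UTC.
WORK_START = time(8, 0)
WORK_END = time(18, 0)

def outside_working_hours(now: time, start: time = WORK_START, end: time = WORK_END) -> bool:
    """Return True when dev instances should be stopped."""
    return not (start <= now < end)

def handler(event, context):
    """Lambda entry point: stop running dev EC2 instances outside working hours."""
    if not outside_working_hours(datetime.now(timezone.utc).time()):
        return
    import boto3  # available in the Lambda runtime
    ec2 = boto3.client("ec2")
    # Find running instances tagged Environment=dev (the tag name is an assumption).
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:Environment", "Values": ["dev"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]
    ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if ids:
        ec2.stop_instances(InstanceIds=ids)
```

Schedule it with an EventBridge rule (e.g. hourly) and pair it with a mirror-image function that starts the instances in the morning.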

Monitoring:

  • Cost Explorer,
  • Cost and Usage Reports,
  • Cost Anomaly Detection,
  • Budgets,
  • Trusted Advisor,
  • cost allocation tags,
  • AWS Compute Optimizer,
  • S3 Storage Lens.