Monday, July 21, 2025

Amazon S3 Events Sent to Multiple SQS Queues | A combination of S3 → SNS → SQS Fan-out.

To send Amazon S3 events to multiple SQS queues, twtech can use the S3 → SNS → SQS fan-out pattern. S3 cannot deliver the same event directly to multiple SQS queues, so SNS acts as a scalable intermediary that fans each notification out to every subscribed queue.

Architecture: S3 → SNS → Multiple SQS Queues

# less

    [S3 Bucket]
         |
      (Event)
         v
    [SNS Topic]
       /   |   \
      v    v    v
[SQS A] [SQS B] [SQS C]

Step-by-Step Guide

1. Create an SNS Topic

# bash

aws sns create-topic --name s3-event-topic
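
If twtech wants the topic ARN handy for the later steps, one option is to capture it when the topic is created (a sketch; the variable name is just an example, and create-topic is idempotent, so re-running it for the same name simply returns the existing topic's ARN):

# bash

TOPIC_ARN=$(aws sns create-topic --name s3-event-topic \
  --query 'TopicArn' --output text)
echo "$TOPIC_ARN"   # arn:aws:sns:region:account-id:s3-event-topic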

2. Create SQS Queues

# bash

aws sqs create-queue --queue-name queue-a

aws sqs create-queue --queue-name queue-b
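
The queue URLs are needed for the lookups in the next step; one option is to capture them with get-queue-url (a sketch; the variable names are just examples):

# bash

QUEUE_A_URL=$(aws sqs get-queue-url --queue-name queue-a --query 'QueueUrl' --output text)
QUEUE_B_URL=$(aws sqs get-queue-url --queue-name queue-b --query 'QueueUrl' --output text)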

3. Subscribe SQS Queues to SNS Topic

Get the ARNs:
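
One way to look them up from the CLI (a sketch that reuses the $QUEUE_A_URL/$QUEUE_B_URL variables captured in step 2; the ARNs can just as well be copied from the console):

# bash

QUEUE_A_ARN=$(aws sqs get-queue-attributes --queue-url "$QUEUE_A_URL" \
  --attribute-names QueueArn --query 'Attributes.QueueArn' --output text)
QUEUE_B_ARN=$(aws sqs get-queue-attributes --queue-url "$QUEUE_B_URL" \
  --attribute-names QueueArn --query 'Attributes.QueueArn' --output text)

Then subscribe each queue to the topic, substituting the real ARNs (or the variables above) for the placeholders: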

# bash

aws sns subscribe --topic-arn <sns-topic-arn> \
  --protocol sqs --notification-endpoint <sqs-queue-a-arn>

aws sns subscribe --topic-arn <sns-topic-arn> \
  --protocol sqs --notification-endpoint <sqs-queue-b-arn>

Make sure each SQS queue's access policy allows the SNS topic to send messages (see the example queue policy in the Permissions Summary below).

4. Attach S3 Event Notification to SNS Topic

Go to S3 → Properties → Event Notifications and set:

  • Event types (e.g., s3:ObjectCreated:*)
  • Destination: SNS Topic

Or use AWS CLI:

# bash

aws s3api put-bucket-notification-configuration \
  --bucket twtech-s3bucket \
  --notification-configuration '{
    "TopicConfigurations": [{
      "TopicArn": "arn:aws:sns:region:account-id:s3-event-topic",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'
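
S3 validates that it can publish to the destination when this configuration is saved, so the SNS topic's access policy must already allow the S3 service to publish. A minimal sketch of such a policy, applied from the CLI (region, account-id, and the bucket ARN are placeholders to adjust; note that setting the Policy attribute replaces the whole topic policy, so any existing statements should be merged in):

# bash

aws sns set-topic-attributes \
  --topic-arn arn:aws:sns:region:account-id:s3-event-topic \
  --attribute-name Policy \
  --attribute-value '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": {"Service": "s3.amazonaws.com"},
      "Action": "SNS:Publish",
      "Resource": "arn:aws:sns:region:account-id:s3-event-topic",
      "Condition": {
        "ArnLike": {"aws:SourceArn": "arn:aws:s3:::twtech-s3bucket"}
      }
    }]
  }'

Once the notification call succeeds, aws s3api get-bucket-notification-configuration --bucket twtech-s3bucket reads the configuration back for verification.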

Permissions Summary

  • The SNS topic's access policy must allow the S3 service to publish to it (scoped to the source bucket), as sketched in step 4 above.
  • Each SQS queue's access policy must allow the SNS topic to send messages to it.
  • Update each SQS queue policy like:

# json

{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"Service": "sns.amazonaws.com"},
    "Action": "sqs:SendMessage",
    "Resource": "arn:aws:sqs:region:account-id:queue-name",
    "Condition": {
      "ArnEquals": {"aws:SourceArn": "arn:aws:sns:region:account-id:s3-event-topic"}
    }
  }]
}
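
To apply this policy from the CLI rather than pasting it into the console, one option is set-queue-attributes, which expects the policy as an escaped JSON string inside an attribute map (a sketch assuming jq 1.6+ and that the policy above is saved as queue-a-policy.json with queue-a's ARN in Resource; file and variable names are illustrative):

# bash

# Wrap the policy document into the single-attribute map SQS expects:
# {"Policy": "<policy JSON as an escaped string>"}
jq -n --rawfile policy queue-a-policy.json '{"Policy": $policy}' > queue-a-attributes.json

aws sqs set-queue-attributes \
  --queue-url "$QUEUE_A_URL" \
  --attributes file://queue-a-attributes.json

Repeat for queue-b with its own queue ARN in the Resource field.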

Example Use Case

When a new file is uploaded to an S3 bucket:

  • Queue A → Triggers media processing.
  • Queue B → Triggers metadata extraction.
  • Queue C → Triggers auditing/logging.
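
Each consumer then polls its own queue. By default SNS wraps the S3 event in an SNS envelope, so the S3 payload arrives as a JSON string in the envelope's Message field. A rough sketch of pulling one message from Queue A and printing the uploaded object's key (assumes jq and the $QUEUE_A_URL variable from step 2; not a full consumer):

# bash

# Long-poll Queue A for a single message.
MSG=$(aws sqs receive-message --queue-url "$QUEUE_A_URL" \
  --max-number-of-messages 1 --wait-time-seconds 10)

# The SQS body is the SNS envelope; the S3 event JSON sits in its "Message" field.
echo "$MSG" | jq -r '.Messages[0].Body | fromjson | .Message | fromjson
                     | .Records[0].s3.object.key'

A real consumer would also delete each message with aws sqs delete-message after processing it. If the SNS envelope is unwanted, the subscriptions in step 3 can instead be created with --attributes RawMessageDelivery=true so the queues receive the bare S3 event.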
