CASE STUDY OF AWS SQS (Simple Queue Service)

Prateek Mishra
Mar 21, 2021 · 6 min read

A warm welcome and lots of good wishes on becoming part of this article. Now, without wasting any time, let's look at what AWS is and what SQS is.

WHAT IS AWS?

Amazon Web Services (AWS) is the world’s most comprehensive and broadly adopted cloud platform, offering over 200 fully featured services from data centers globally. Millions of customers — including the fastest-growing startups, largest enterprises, and leading government agencies — are using AWS to lower costs, become more agile, and innovate faster.

How does it work?

Instead of buying, owning, and maintaining physical data centers and servers, you can access technology services, such as computing power, storage, and databases, on an as-needed basis from a cloud provider like Amazon Web Services (AWS).

What is a service?

A service is a single unit of functionality offered by AWS, and more than 200 services are available on the platform. Some of the most commonly used ones are listed below.

  • Service #1 — Amazon S3.
  • Service #2 — Amazon EC2 [Elastic Compute Cloud]
  • Service #3 — AWS Lambda.
  • Service #4 — Amazon Glacier.
  • Service #5 — Amazon SQS.
  • Service #6 — Amazon CloudFront.
  • Service #7 — Amazon EBS [Elastic Block Store]
  • Service #8 — Amazon Kinesis.

AWS offers far more services than can be covered in this article; you can explore the full list at https://aws.amazon.com/search/?searchQuery=services

I think you are now a little familiar with AWS and its services. Below, I would like to discuss the AWS SQS service, which is the main focus of this article.

WHAT IS THE AWS SQS SERVICE?

Amazon Simple Queue Service (SQS) is a fully managed message queuing service that enables you to decouple and scale microservices, distributed systems, and serverless applications. SQS eliminates the complexity and overhead associated with managing and operating message oriented middleware, and empowers developers to focus on differentiating work. Using SQS, you can send, store, and receive messages between software components at any volume, without losing messages or requiring other services to be available. Get started with SQS in minutes using the AWS console, Command Line Interface or SDK of your choice, and three simple commands.
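
As a concrete starting point, here is a minimal sketch of those operations using the Python SDK (boto3). The queue name "demo-queue" is an illustrative assumption, and the snippet assumes AWS credentials and a default region are already configured.

```python
import boto3

# Assumes AWS credentials and a default region are already configured
sqs = boto3.client("sqs")

# 1. Create a queue (the name "demo-queue" is just an example)
queue_url = sqs.create_queue(QueueName="demo-queue")["QueueUrl"]

# 2. Send a message to the queue
sqs.send_message(QueueUrl=queue_url, MessageBody="Hello from SQS!")

# 3. Receive the message, then delete it to acknowledge successful processing
response = sqs.receive_message(QueueUrl=queue_url, MaxNumberOfMessages=1)
for message in response.get("Messages", []):
    print(message["Body"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```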

SQS offers two types of message queues. Standard queues offer maximum throughput, best-effort ordering, and at-least-once delivery. SQS FIFO queues are designed to guarantee that messages are processed exactly once, in the exact order that they are sent.

FUNCTIONALITY

  1. Message locking: when a message is received, it becomes “locked” (hidden for the duration of the visibility timeout) while it is being processed. This keeps other consumers from processing the same message simultaneously. If processing fails, the lock expires and the message becomes available again.
  2. Message retention: messages can be stored in a queue for a maximum of 14 days.
  3. Dead-letter queues (DLQ): you can set a maximum receive count on a queue; if a message has been received that many times without being successfully processed and deleted, it is moved to the dead-letter queue.
  4. Data size: a message body can be anywhere from 1 byte up to 256 KB.
  5. Server-side encryption (SSE): protect the contents of messages in Amazon SQS queues using keys managed in the AWS Key Management Service (AWS KMS). SSE encrypts messages as soon as Amazon SQS receives them. The messages are stored in encrypted form and Amazon SQS decrypts messages only when they are sent to an authorized consumer. A minimal configuration sketch covering these settings follows this list.
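
All of these settings map to queue attributes that can be supplied when a queue is created. Below is the configuration sketch referenced in the list above, using boto3; the queue names ("orders" and "orders-dlq"), the 30-second visibility timeout, the maxReceiveCount of 5, and the AWS managed key alias are assumptions chosen purely for illustration.

```python
import json

import boto3

sqs = boto3.client("sqs")

# Dead-letter queue that will receive messages that repeatedly fail processing
dlq_url = sqs.create_queue(QueueName="orders-dlq")["QueueUrl"]
dlq_arn = sqs.get_queue_attributes(
    QueueUrl=dlq_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Main queue: message "lock" duration, 14-day retention, DLQ redrive policy, and SSE via KMS
sqs.create_queue(
    QueueName="orders",
    Attributes={
        "VisibilityTimeout": "30",            # seconds a received message stays locked
        "MessageRetentionPeriod": "1209600",  # 14 days (the maximum), in seconds
        "RedrivePolicy": json.dumps({
            "deadLetterTargetArn": dlq_arn,
            "maxReceiveCount": "5",           # after 5 receives, move the message to the DLQ
        }),
        "KmsMasterKeyId": "alias/aws/sqs",    # server-side encryption with the AWS managed key
    },
)
```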

QUEUE TYPES

There are two types of message queues:

1. Standard Queues

Unlimited Throughput: Standard queues support a nearly unlimited number of transactions per second (TPS) per API action.

At-Least-Once Delivery: A message is delivered at least once, but occasionally more than one copy of a message is delivered.

Best-Effort Ordering: Occasionally, messages might be delivered in an order different from which they were sent.
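
Because a standard queue can occasionally deliver a message more than once, and not necessarily in order, consumers are usually written to be idempotent. Here is a minimal sketch of such a consumer using long polling in boto3; the queue URL and the in-memory set of processed message IDs are illustrative assumptions (a real system would track completed work in a durable store).

```python
import boto3

sqs = boto3.client("sqs")
# Example queue URL; substitute your own
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/demo-queue"

processed_ids = set()  # illustration only; use a durable store in practice

while True:
    # Long polling (WaitTimeSeconds) cuts down on empty responses and cost
    response = sqs.receive_message(
        QueueUrl=queue_url, MaxNumberOfMessages=10, WaitTimeSeconds=20
    )
    for message in response.get("Messages", []):
        if message["MessageId"] not in processed_ids:
            print("processing:", message["Body"])  # do the real work here
            processed_ids.add(message["MessageId"])
        # Delete in either case so a duplicate delivery is not processed again
        sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])
```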

2. FIFO Queues

High Throughput: By default, FIFO queues support up to 300 messages per second (300 send, receive, or delete operations per second). When you batch 10 messages per operation (maximum), FIFO queues can support up to 3,000 messages per second. If you require higher throughput, you can enable high throughput mode for FIFO (offered as a Preview in select regions) on the Amazon SQS console, which will support up to 30,000 messages per second with batching, or up to 3,000 messages per second without batching.

Exactly-Once Processing: A message is delivered once and remains available until a consumer processes and deletes it. Duplicates aren’t introduced into the queue.

First-In-First-Out Delivery: The order in which messages are sent and received is strictly preserved (i.e. First-In-First-Out).
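
As a short illustration of these FIFO properties, here is a minimal boto3 sketch; the queue name "orders.fifo" and the message group "customer-42" are assumptions made only for the example.

```python
import boto3

sqs = boto3.client("sqs")

# FIFO queue names must end in ".fifo"; content-based deduplication lets SQS
# derive the deduplication ID from a hash of the message body
queue_url = sqs.create_queue(
    QueueName="orders.fifo",
    Attributes={"FifoQueue": "true", "ContentBasedDeduplication": "true"},
)["QueueUrl"]

# Messages sharing a MessageGroupId are delivered strictly in order, and batching
# up to 10 messages per call is what raises FIFO throughput toward 3,000 messages/second
entries = [
    {"Id": str(i), "MessageBody": f"order-event-{i}", "MessageGroupId": "customer-42"}
    for i in range(10)
]
sqs.send_message_batch(QueueUrl=queue_url, Entries=entries)
```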

BENEFITS

  1. Message Reliability: Multiple copies of every message are stored redundantly across multiple availability zones so that they are available whenever needed.
  2. Secure Data: You can use Amazon SQS to exchange sensitive data between applications using server-side encryption (SSE) to encrypt each message body. Amazon SQS SSE integration with AWS Key Management Service (KMS) allows you to centrally manage the keys that protect SQS messages along with keys that protect your other AWS resources.
  3. Scalability: Amazon SQS leverages the AWS cloud to dynamically scale based on demand. SQS scales elastically with your application so you don’t have to worry about capacity planning and pre-provisioning.

CASE STUDIES

1. BMW (Bayerische Motoren Werke Aktiengesellschaft)


The BMW Group is using AWS for its new connected-car application that collects sensor data from BMW 7 Series cars to give drivers dynamically updated map information. BMW Group is one of the leading manufacturers of premium cars and mobility services in the world, with brands such as Rolls Royce, BMW, and Mini. BMW built its new car-as-a-sensor (CARASSO) service in only six months leveraging Amazon Simple Storage Service (Amazon S3), Amazon Simple Queue Service (Amazon SQS), Amazon DynamoDB, Amazon Relational Database Service (Amazon RDS), and AWS Elastic Beanstalk. By running on AWS, CARASSO can adapt to rapidly changing load requirements that can scale up and down by two orders of magnitude within 24 hours. By 2018 CARASSO is expected to process data collected by a fleet of 100,000 vehicles traveling more than eight billion kilometers.

2. NASA (National Aeronautics and Space Administration)


Architecture

The NASA Image and Video Library is a cloud-native solution, with the front-end web app separated from the backend API. It runs as immutable infrastructure in a fully automated environment, with all infrastructure defined in code to support continuous integration and continuous deployment (CI/CD).

In building the solution, ManTech International took advantage of the following AWS services:

Amazon Elastic Compute Cloud (Amazon EC2), which provides secure, resizable compute capacity in the cloud. This enables NASA to scale up under load and scale down during periods of inactivity to save money, and pay for only what it uses.

Elastic Load Balancing (ELB), which is used to distribute incoming traffic across multiple Amazon EC2 instances, as required to achieve redundancy and fault-tolerance.

Amazon Simple Storage Service (Amazon S3), which supports object storage for incoming (uploaded) media, metadata, and published assets.

Amazon Simple Queue Service (Amazon SQS), which is used to decouple incoming jobs from pipeline processes (a short sketch of this pattern appears after the list of services).

Amazon Relational Database Service (Amazon RDS), which is used for automatic synchronization and failover.

Amazon DynamoDB, a fast and flexible NoSQL database service, which is used to track incoming jobs, published assets, and users.

Amazon Elastic Transcoder, which is used to transcode audio and video to various resolutions.

Amazon CloudSearch, which is used to support searching by free text or fields.

Amazon Simple Notification Service (Amazon SNS), which is used to trigger the processing pipeline when new content is uploaded.

AWS CloudFormation, which enables automated creation, updating, and destruction of AWS resources. ManTech International also used the Troposphere library, which enables the creation of objects via AWS CloudFormation using Python instead of hand-coded JSON — each object representing one AWS resource such as an instance, an Elastic IP (EIP) address, or a security group.

Amazon CloudWatch, which provides a monitoring service for AWS cloud resources and the applications running on AWS.
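
To make the SNS-to-SQS decoupling pattern mentioned above concrete, here is a minimal boto3 sketch; the topic name, queue name, and S3 path are hypothetical and not taken from the NASA system, and the queue access policy that actually authorizes SNS to deliver messages to the queue is omitted for brevity.

```python
import boto3

sns = boto3.client("sns")
sqs = boto3.client("sqs")

# Hypothetical resources standing in for the upload notification and job queue
topic_arn = sns.create_topic(Name="media-uploaded")["TopicArn"]
queue_url = sqs.create_queue(QueueName="transcode-jobs")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# Fan the upload notification out to the job queue; workers then poll SQS at
# their own pace, which is what decouples ingestion from the processing pipeline.
# (In practice the queue also needs an access policy allowing SNS to send to it.)
sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)
sns.publish(TopicArn=topic_arn, Message="s3://example-bucket/uploads/video.mp4")
```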

Thank you!
