Serverless Architecture on AWS

Kevin Kiruri
10 min read · Feb 14, 2024


Introduction

Let’s assume you want to receive regular notifications based on some event occurrence; that is the case in this demo walkthrough. Suppose you have stores in various locations, and at the end of each day you would like to know the products and store locations where the inventory is at 5 units or fewer. This serverless infrastructure can send you those notifications automatically, and the business applications for this pattern are endless. To put the icing on the cake, there is a CloudFormation YAML template that creates the whole infrastructure at the click of a button.

Prerequisites

  1. Have an AWS account. If you don’t have one, sign up here and enjoy the benefits of the Free-Tier Account
  2. View project files in my GitHub portfolio

Creating an S3 Bucket

  1. On the AWS console, search for S3 and click on it to open the S3 Console Page
  2. Under Buckets click on Create bucket

3. Enter your preferred AWS Region and Bucket name. Remember that a bucket name must be globally unique across all of Amazon S3, not just your account or Region. Scroll down and click on Create bucket. Note the ARN of the created bucket; we will be using it later
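
If you prefer scripting over clicking through the console, here is a minimal boto3 sketch of the same step. The bucket name and Region below are placeholders; substitute your own values.

import boto3

# Placeholder values -- replace with your own bucket name and Region
bucket_name = "my-inventory-bucket-240215"
region = "us-east-1"

s3 = boto3.client("s3", region_name=region)

# us-east-1 does not accept a LocationConstraint; other Regions require it
if region == "us-east-1":
    s3.create_bucket(Bucket=bucket_name)
else:
    s3.create_bucket(
        Bucket=bucket_name,
        CreateBucketConfiguration={"LocationConstraint": region},
    )

# S3 bucket ARNs follow a fixed format
print("Bucket ARN:", f"arn:aws:s3:::{bucket_name}")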

Create a DynamoDB Table

  1. Search for DynamoDB in the Services search box. Once opened, click on Create table

2. Enter the Table name, Partition key and Sort key. Scroll to the bottom and click on Create table

3. The table is created and appears on your list of tables. Note the ARN of the table; we will use it later.
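
If you want to script this step instead, here is a minimal boto3 sketch. The key schema below (partition key StoreRegion, sort key Product) is an assumption chosen to match the Lambda code later in this walkthrough, and on-demand billing is used so there is no capacity to manage.

import boto3

dynamodb = boto3.client("dynamodb")

# Assumed key schema, matching the item attributes written by the Lambda code below
response = dynamodb.create_table(
    TableName="AppleInventoryTable",
    AttributeDefinitions=[
        {"AttributeName": "StoreRegion", "AttributeType": "S"},
        {"AttributeName": "Product", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "StoreRegion", "KeyType": "HASH"},
        {"AttributeName": "Product", "KeyType": "RANGE"},
    ],
    BillingMode="PAY_PER_REQUEST",
)

print("Table ARN:", response["TableDescription"]["TableArn"])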

Create an SNS Topic

  1. In the Amazon SNS dashboard, under the Topics page, click on Create topic

2. Select the Standard type, then give your preferred name for the SNS Topic. Scroll down and click on Create topic

3. Once created, on the Subscriptions tab, click on Create subscription

4. In the Create subscription page, set the Protocol as Email and enter your preferred email address as the Endpoint. Scroll down and click on Create subscription

5. You will receive an email asking you to confirm the subscription. Click on Confirm subscription.

You should then get a Subscription confirmed notification
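
The same topic and email subscription can also be created with boto3. The topic name LowInventoryTopic matches the ARN constructed in the notification Lambda later on, and the email address is a placeholder.

import boto3

sns = boto3.client("sns")

# The topic name matches the ARN built in the notification Lambda below
topic = sns.create_topic(Name="LowInventoryTopic")
topic_arn = topic["TopicArn"]

# Placeholder email -- a confirmation message is sent to this address
sns.subscribe(
    TopicArn=topic_arn,
    Protocol="email",
    Endpoint="you@example.com",
)

print("SNS topic ARN:", topic_arn)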

Creating a Role for Lambda to read from S3 and update DynamoDB

  1. In the IAM Console, under Policies, click on Create policy

2. Select the JSON tab of the policy editor and paste the JSON policy below

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "{S3-Bucket ARN}/*",
        "{S3-Bucket ARN}"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:PutItem",
        "dynamodb:BatchWriteItem"
      ],
      "Resource": "{DynamoDB ARN}"
    }
  ]
}

Replace {S3-Bucket ARN} with the ARN of the S3 bucket you created and {DynamoDB ARN} with the ARN of your DynamoDB Table

3. Scroll to the bottom and click on Next

4. Give the policy a name then scroll down and click on Create policy

5. On the IAM Console click on Roles then Create role

6. Under Trusted entity type, select AWS service, select Lambda as the Use case, then click on Next

7. In the Add permissions page, search for the policy we just created, select the check box then click on Next

8. Give the Role a name, then scroll down and click on Create role
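
For reference, here is a boto3 sketch of the same policy-and-role setup. It assumes you have saved the JSON policy above (with your ARNs filled in) to a local file, and the policy and role names are placeholders; the identical approach works for the role in the next section.

import json
import boto3

iam = boto3.client("iam")

# Placeholder names -- replace with your own
policy_name = "S3ToDynamoDBPolicy"
role_name = "LambdaS3ToDynamoDBRole"

# The JSON policy shown above, saved locally with your ARNs filled in
with open("s3_to_dynamodb_policy.json") as f:
    policy_document = f.read()

# Trust policy that lets Lambda assume the role
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Service": "lambda.amazonaws.com"},
            "Action": "sts:AssumeRole"
        }
    ]
}

policy = iam.create_policy(PolicyName=policy_name, PolicyDocument=policy_document)

iam.create_role(
    RoleName=role_name,
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

iam.attach_role_policy(RoleName=role_name, PolicyArn=policy["Policy"]["Arn"])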

Creating a Role for Lambda to read from DynamoDB and send an SNS Notification

  1. Follow the same steps used to create the previous role, but use the JSON policy below
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:PutItem",
        "dynamodb:UpdateItem",
        "dynamodb:GetItem",
        "dynamodb:Scan",
        "dynamodb:Query"
      ],
      "Resource": "{DynamoDB Table ARN}"
    },
    {
      "Effect": "Allow",
      "Action": [
        "dynamodb:GetRecords",
        "dynamodb:GetShardIterator",
        "dynamodb:DescribeStream",
        "dynamodb:ListStreams"
      ],
      "Resource": "{DynamoDB Table ARN}/stream/*"
    },
    {
      "Effect": "Allow",
      "Action": "sns:Publish",
      "Resource": "{SNS Topic ARN}"
    }
  ]
}

Insert your DynamoDB table ARN and the SNS topic ARN in the respective positions in the policy above
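
Note that the {DynamoDB Table ARN}/stream/* resource only exists once DynamoDB Streams is enabled on the table, and streams are also required for the trigger we add later. Here is a small boto3 sketch that enables streams if needed and prints the ARNs to paste into the policy; the stream view type is an assumption.

import boto3

dynamodb = boto3.client("dynamodb")
table = dynamodb.describe_table(TableName="AppleInventoryTable")["Table"]

# Enable DynamoDB Streams if they are not already on
if "LatestStreamArn" not in table:
    dynamodb.update_table(
        TableName="AppleInventoryTable",
        StreamSpecification={"StreamEnabled": True, "StreamViewType": "NEW_IMAGE"},
    )
    table = dynamodb.describe_table(TableName="AppleInventoryTable")["Table"]

print("Table ARN: ", table["TableArn"])
print("Stream ARN:", table["LatestStreamArn"])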

Create a Lambda Function to Read from S3 and Upload to DynamoDB

  1. Search for Lambda and open the Lambda Console. Click on Create function

2. Under Create function, choose Author from scratch, enter your preferred Function name and select a Python Runtime (the code below is Python). Scroll down and click on Create function

3. In the Code source section, enter the code below and click on Deploy

import json
import boto3
import csv
import os

def lambda_handler(event, context):

    # Get the S3 bucket and object information from the event
    s3_bucket = event['Records'][0]['s3']['bucket']['name']
    s3_key = event['Records'][0]['s3']['object']['key']

    # Test lambda function manually
    # s3_bucket = 'productinventory240215'
    # s3_key = 'inventory.csv'

    # Set the DynamoDB table name
    dynamo_table_name = 'AppleInventoryTable'

    # Create a DynamoDB client
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table(dynamo_table_name)

    # Create an S3 client
    s3 = boto3.client('s3')

    # Download the CSV file from S3
    temp_file_path = '/tmp/temp.csv'
    s3.download_file(s3_bucket, s3_key, temp_file_path)

    # Read and process the CSV file, skipping the header row
    with open(temp_file_path, 'r') as csvfile:
        csvreader = csv.reader(csvfile)
        # Skip the header row
        next(csvreader, None)

        for row in csvreader:
            store = row[0]
            product = row[1]
            count = int(row[2])

            # Update DynamoDB table
            table.put_item(
                Item={
                    'StoreRegion': store,
                    'Product': product,
                    'ProductCount': count
                }
            )

    # Clean up the temporary file
    os.remove(temp_file_path)

    return {
        'statusCode': 200,
        'body': 'CSV processing complete'
    }

4. Under the Configuration tab, select Triggers then click on Add trigger

5. Select the Source as S3, then select the Bucket you created. For Event types, use All object create events. Add .csv as the Suffix (we will be uploading CSV files with the data). Select the acknowledgement check box, then click on Add

6. Under the Permissions section, click on Edit

7. Scroll down and increase the Timeout to 5 minutes; this helps when processing large files. Under Execution role, select Use an existing role, then select the role we created to read data from S3 and upload to DynamoDB. Click on Save
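
If you would rather wire up the trigger programmatically, here is a boto3 equivalent of the console steps above: it grants S3 permission to invoke the function and then adds a bucket notification for objects ending in .csv. The function name, bucket name and account ID are placeholders.

import boto3

# Placeholder names -- replace with your function, bucket and account ID
function_name = "CsvToDynamoDBFunction"
bucket_name = "my-inventory-bucket-240215"
account_id = "123456789012"

lambda_client = boto3.client("lambda")
s3 = boto3.client("s3")

# Allow S3 to invoke the function
lambda_client.add_permission(
    FunctionName=function_name,
    StatementId="AllowS3Invoke",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn=f"arn:aws:s3:::{bucket_name}",
    SourceAccount=account_id,
)

function_arn = lambda_client.get_function(FunctionName=function_name)["Configuration"]["FunctionArn"]

# Notify the function on any object created with a .csv suffix
s3.put_bucket_notification_configuration(
    Bucket=bucket_name,
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "LambdaFunctionArn": function_arn,
                "Events": ["s3:ObjectCreated:*"],
                "Filter": {
                    "Key": {"FilterRules": [{"Name": "suffix", "Value": ".csv"}]}
                },
            }
        ]
    },
)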

Create a Lambda Function to Read from DynamoDB and send an SNS Notification for Low Inventory

  1. Navigate to the Lambda Console page and click on Create function

2. Select Author from scratch, give the function a name and select the Python runtime. Scroll down and click on Create function

3. In the Code source section, insert the Python code below then click on Deploy

import boto3
import json
import os

def lambda_handler(event, context):
    # Define DynamoDB and SNS clients
    dynamo_table_name = 'AppleInventoryTable'
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table(dynamo_table_name)
    sns = boto3.client('sns')
    region = os.environ['AWS_REGION']
    account_id = context.invoked_function_arn.split(":")[4]

    # Scan the DynamoDB table for items with a count of 5 or fewer
    response = table.scan(
        FilterExpression=boto3.dynamodb.conditions.Attr('ProductCount').lt(6)
    )

    # Check whether any low-inventory items were found
    items = response.get('Items', [])

    if items:
        # Prepare notification message
        notification_message = "Low inventory alert:\n"
        for item in items:
            notification_message += f"StoreRegion: {item['StoreRegion']}, Item: {item['Product']}, Count: {item['ProductCount']}\n"

        # Publish notification to SNS
        sns.publish(
            # TopicArn='arn:aws:sns:us-east-1:471285348599:LowInventoryTopic',
            TopicArn=f'arn:aws:sns:{region}:{account_id}:LowInventoryTopic',
            Message=notification_message,
            Subject='Low Inventory Alert'
        )

        return {
            'statusCode': 200,
            'body': 'Notification sent successfully.'
        }
    else:
        return {
            'statusCode': 200,
            'body': 'No items with a count of 5 or fewer.'
        }

4. Under the Configuration tab, select Permissions, then click on Edit

5. On the Edit basic settings page, scroll down and set the Timeout to 5 minutes. Under Execution role, select Use an existing role and choose the role you created to read from DynamoDB and send a notification of low inventory. Scroll to the bottom and click on Save

6. Under the Configuration tab, select Triggers then click on Add trigger

7. Select DynamoDB as the Source, then select the DynamoDB table we created earlier (the table must have DynamoDB Streams enabled for this trigger). Check the box for Activate trigger, enter 1,000 as the Batch size, then scroll down and click on Add
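
The console does this wiring for you, but for completeness here is a boto3 sketch of the same trigger using an event source mapping. The function name is a placeholder, and the table must already have DynamoDB Streams enabled.

import boto3

dynamodb = boto3.client("dynamodb")
lambda_client = boto3.client("lambda")

# Placeholder function name -- replace with your notification function
function_name = "LowInventoryNotifier"

# Look up the stream ARN (DynamoDB Streams must be enabled on the table)
stream_arn = dynamodb.describe_table(TableName="AppleInventoryTable")["Table"]["LatestStreamArn"]

lambda_client.create_event_source_mapping(
    EventSourceArn=stream_arn,
    FunctionName=function_name,
    StartingPosition="LATEST",
    BatchSize=1000,
    Enabled=True,
)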

Testing

  1. Upload a CSV file with three columns: Store, Product and Count. You can get a sample file here
  2. Navigate to S3 and select the bucket we created
  3. Click on Upload

4. Click on Add files then navigate to your files and select the required file. Click on Upload at the bottom

5. Once uploaded, the Lambda function should write the items to DynamoDB

6. An email should also be sent, showing the Products and Stores with 5 or fewer items in inventory

7. This confirms that the system works end to end.
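
If you do not want to hand-craft a file, the short sketch below builds a sample CSV with the expected Store, Product and Count columns and uploads it to the bucket, which fires the whole pipeline. The bucket name and sample rows are made-up placeholders.

import csv
import boto3

# Placeholder bucket name -- replace with the bucket you created
bucket_name = "my-inventory-bucket-240215"

# Build a small sample file with the expected Store, Product, Count columns
rows = [
    ["Store", "Product", "Count"],
    ["Nairobi", "MacBook Pro", 3],
    ["Mombasa", "iPhone 15", 12],
    ["Kisumu", "AirPods", 4],
]

with open("inventory.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

# Uploading the file fires the S3 trigger and kicks off the whole pipeline
boto3.client("s3").upload_file("inventory.csv", bucket_name, "inventory.csv")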

Clean Up

  1. Delete the uploaded file from the S3 bucket
  2. Delete the S3 bucket
  3. Delete the SNS Topic
  4. Delete the DynamoDB table
  5. Delete the Lambda functions
  6. Delete the IAM roles created
  7. Delete the IAM policies created

Infrastructure as Code

The beauty of setting up infrastructure on AWS is that we have access to tools used for Infrastructure as Code (IaC).

  1. Navigate to CloudFormation on the AWS console. Under Stacks click on Create stack

2. Select Template is ready and Upload a template file so that we can upload our YAML template. You can get the YAML template in the GitHub repo. Click Next

3. Provide a Stack name, the notification email and your preferred S3 bucket name

4. Click on Next on the next two pages, then click on Submit at the bottom. Give the resources time to create; you can monitor their creation on the Resources tab. Once complete, the status will show CREATE_COMPLETE. In the event of a ROLLBACK, check the error on the Events tab, correct it and try again

5. Confirm subscription to the SNS topic from your email

6. Proceed to the Testing section as done in the manual setup
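
The same stack can also be created from a script with boto3. The template file name and parameter keys below are assumptions; check the template in the GitHub repo for the actual parameter names it expects.

import boto3

cfn = boto3.client("cloudformation")

# Assumed template file name -- use the YAML template from the GitHub repo
with open("serverless-inventory.yaml") as f:
    template_body = f.read()

cfn.create_stack(
    StackName="serverless-inventory",
    # Parameter keys are assumptions; match them to the template's Parameters section
    Parameters=[
        {"ParameterKey": "NotificationEmail", "ParameterValue": "you@example.com"},
        {"ParameterKey": "BucketName", "ParameterValue": "my-inventory-bucket-240215"},
    ],
    TemplateBody=template_body,
    # Acknowledges that the template creates IAM resources
    Capabilities=["CAPABILITY_NAMED_IAM"],
)

# Tear everything down again once you are done
# cfn.delete_stack(StackName="serverless-inventory")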

Clean Up

  1. Once done, click on Delete on the CloudFormation stack page. This will delete all the resources created

2. On the pop-up that appears, click on Delete and the infrastructure will be torn down

Conclusion

These demo steps give a walkthrough of how to create a serverless architecture on AWS. You can customize the configurations to fit your use case.

Follow for more demos and networking.
Kevin Kiruri LinkedIn
