Cloud Native Three Tier Application

Note

This is a 201 Level Lab which assumes basic knowledge of AWS environment setup. As such the instructions are more high level rather than specifying every step to be taken in the console.
If in doubt run through the labs on a 3 Tier Web Setup and S3 Storage and come back to this one.
In the production site we will have the ability to ensure people have completed earlier labs to unlock more advanced concepts.


In this lab we will look at refactoring a traditional three tier application as a more cloud native service, with a particular focus on both High Availability and on aligning cost to usage volume.

The key steps we will take in this lab are;

  • Rebuild a server hosted website as a static website on the AWS S3 object storage service
  • As a demo look at custom domain names, using this site as an example
  • Look at some of the more sophisticated tools you can now use with static sites
  • Build a DynamoDB database to hold the transactions list
  • Introduce Lambda and the API gateway
  • Develop a Python script on an EC2 instance to build a webpage from DynamoDB
  • Look at how we build this into our static website using dynamic HTML
  • Port our EC2 Python script to be a Lambda serverless function
  • Link our S3 static website to the API gateway to Lambda and DynamoDB
  • Compare the costs and availability of this architecture vs our 3 tier web setup
  • Review our architecture diagrams

Creating a Static Website on S3


Because S3 serves every object over the HTTP protocol, it makes an excellent resource for serving static websites.

Sites can be sets of linked static pages, like this one, or you can embed dynamic elements or use client-side JavaScript to run more complex logic in the browser.

In this lab we are going to mix static S3 content with dynamic content served from serverless functions backed by a NoSQL database.

Website Setup


In a previous module we created two S3 buckets, "...-oxford-internal-files" and "...-oxford-public-files". We are going to use the "...-oxford-public-files" bucket to host our static website. If you didn't complete that lab, go to the Storage and S3 Lab and work through just the S3 bucket section. (note: this will be an #include file soon)

Log back in to the AWS console and go to the S3 console. Select your "...-oxford-public-files" bucket and then select the "Permissions" tab.

In the "Block public access (bucket settings)" section, click on "Edit" and ensure every check box is unchecked (as below) then "Save changes"


publicaccesss3.png

To allow full public access to every object in this bucket (thus allowing it to serve as a public website) we also have to add a bucket policy to the bucket.

Still in the Permissions tab you should see a section called "Bucket policy". Click on "Edit" and enter the following policy to allow anonymous read access to the bucket. NOTE: You must replace the string "Bucket-Name" with the specific name of your bucket e.g. "alistair-oxford-public-files".

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::Bucket-Name/*"
            ]
        }
    ]
}

Finally, we need to specifically enable the bucket as a public website. Still on the S3 management console page for your bucket, go to the "Properties" tab.

Scroll down to "Static website hosting" and click "Edit". Change the setting for "Static website hosting" from "Disable" to "Enable". You can leave all the other settings at their defaults; we will use "index.html" as the default homepage and "error.html" as the error page, but note you do have to type these in to confirm. We don't need to add any redirect rules. Click on "Save changes".

If you go back to the bucket properties page you should see the static website is now described as below;

staticwebsitehosting.png

Note

If you click on the Bucket website endpoint you will most likely get a "404 Not Found" error page. To fix this we need to upload an "index.html" page.


The easiest way to set this up is to create an initial "index.html" file on your laptop. Using the text editor of your choice, create this very simple index.html file and save it; we can reuse the file from the first lab, copied below.


<HTML>
        <HEAD>
                <TITLE>CLO - Internet Banking Test Site</TITLE>
        </HEAD>
        <BODY>
                <H2>Online Banking</H2>
                <H3>Transactions March 2025</H3>
                <TABLE BORDER=2 CELLSPACING=5 CELLPADDING=5>
                        <TR>
                                <TH>Description</TH><TH>Value</TH>
                        </TR>
                        <!-- any sample rows will do for this first test page -->
                        <TR>
                                <TD>Transport for London</TD><TD>6.70</TD>
                        </TR>
                </TABLE>
        </BODY>
</HTML>

Now in the S3 console, go to the bucket and select the "Objects" tab. Click on "Upload", then select and upload your "index.html" file.
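
As an alternative to the console upload, a minimal Boto3 sketch would look like this (the bucket name is an example, substitute your own);

import boto3

s3_client = boto3.client("s3")

# Upload the local index.html and set its content type so browsers render it as HTML
s3_client.upload_file(
    "index.html",
    "alistair-oxford-public-files",   # replace with your bucket name
    "index.html",
    ExtraArgs={"ContentType": "text/html"},
)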

You should now see your (very minimalist) homepage is there.

Setting Up DynamoDB


We are now going to set up a very simple table for our transactions in DynamoDB, AWS's highly available NoSQL database.
Like S3, DynamoDB is automatically replicated across three Availability Zones in a region and supports auto healing, so that should an Availability Zone fail, traffic will be automatically routed to healthy capacity in another Availability Zone.
DynamoDB also mirrors S3 in that its consumption model is usage based, which makes it very useful for price-per-transaction architectures.

NoSQL databases are a fascinating topic and worthy of a course of their own. However, for the purpose of this lab we are going to simply set up a very simple NoSQL database table to demonstrate core concepts around service availability.

To set up DynamoDB, in the AWS Console go to "DynamoDB".

In the left hand menu select "Tables", then click on "Create table".

Under "Table details", for the table name enter "oxford-banking-transactions".

For the partition key we will call this "description" and leave it as a "string" and for the "Sort key" we will call this "value" and change the type to a "number".

For all the other settings we will leave them unchanged, so Table settings should have "Default settings" checked, and we don't need to add any additional tags.

Click "Create table"

You will now see that our table has been created, but at present it has no data in it. We could import data from a file or use a script, but for now use the console to add the data below (note you will need to add the "Sequence Number" value manually each time; a scripted alternative follows the table).


Sequence Number   Description                 Value
1                 Transport for London         6.70
2                 Pret a Manger                9.40
3                 Cancer Research UK          10.00
4                 Black Sheep Coffee           4.25
5                 Thames Water                53.24
6                 Amazon UK                   32.45
7                 Transport for London         4.30
8                 ATM                        100.00
9                 Boisdale of Canary Wharf    40.40
10                O2                          20.36
11                Justgiving                 101.00
12                Mr Fox                      49.00
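
If you would rather script the load than type each item into the console, a minimal Boto3 sketch along these lines should work (the table name and region match the setup above; fill in the remaining rows from the table);

import boto3

dynamodb_client = boto3.client("dynamodb", region_name="eu-west-1")

transactions = [
    (1, "Transport for London", "6.70"),
    (2, "Pret a Manger", "9.40"),
    (3, "Cancer Research UK", "10.00"),
    # ... remaining rows from the table above
]

for seq, description, value in transactions:
    # Each item carries the partition key, sort key and the Sequence Number attribute
    dynamodb_client.put_item(
        TableName="oxford-banking-transactions",
        Item={
            "description": {"S": description},
            "value": {"N": value},
            "Sequence Number": {"N": str(seq)},
        },
    )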

S3 and DynamoDB Testing


Clicking the button below will test your work so far, including the S3 setup and Dynamo DB creation.

Test your build
  • Testing S3 Buckets
  • Testing S3 Bucket Public Access Policy
  • Testing S3 public website homepage response
  • Testing Dynamo DB Initial Setup
  • Testing Dynamo DB Table

Linking EC2 to DynamoDB


Now we have DynamoDB set up with some sample data, we are going to write a script to retrieve the transactions and display them on our website.

As this is now a more advanced lab session, rather than describing every step in detail the lab notes will describe the things we need to create. In each case we will be building on a step we already took in earlier labs, so if in doubt you can refer back to an earlier lab or the AWS guides.

First we need to build a VPC. You can have more than one VPC in a region, so don't worry if the "oxford-course-vpc" is already set up in the console.

Think of a new naming convention for your new environment to distinguish it from the "oxford-course-service" format we have been using up to now, or alternatively just add a suffix like "lab4" or "v2" in the tag structure.

Create a new VPC and name it according to your new structure. VPCs can have overlapping IP addresses so you could reuse 10.0.0.0/16, or to differentiate it you might want to use 10.1.0.0/16, for example.

In the VPC set up a public webserver subnet. This is only going to host one simple server for testing, so you could create it as a /28 network (11 available IP addresses, once AWS reserves five per subnet) or a /24 network (251 available IPs). The subnet can be in any Availability Zone, but it should be public, so an Internet Gateway will be created.

Once the VPC and subnet are created, check the route table for the subnet. It should have a local route for the CIDR range of the VPC and a route for 0.0.0.0/0 to the Internet Gateway (this will begin with igw-...).

Once this is done we need a new security group for the test instance. This will need to allow SSH traffic in, so allow port 22 inbound from anywhere. It should also allow HTTP (port 80) traffic inbound from anywhere for the test webserver.

For outbound traffic we should allow HTTPS and also HTTP, so we can make connections out via the Internet Gateway to the DynamoDB API endpoints over the Internet and to objects on S3.
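
If you want to script this part instead of using the console, a Boto3 sketch for the security group might look like the following (the VPC ID and group name are illustrative assumptions; note a new group's default egress rule already allows all outbound traffic, which covers HTTP and HTTPS);

import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Hypothetical VPC ID and group name; substitute your own
sg = ec2.create_security_group(
    GroupName="lab4-test-webserver-sg",
    Description="Test webserver - SSH and HTTP in from anywhere",
    VpcId="vpc-0123456789abcdef0",
)

ec2.authorize_security_group_ingress(
    GroupId=sg["GroupId"],
    IpPermissions=[
        {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},  # SSH
        {"IpProtocol": "tcp", "FromPort": 80, "ToPort": 80,
         "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},  # HTTP
    ],
)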

Network Testing


As the instructions in this lab are more high level, clicking the button below will test your network setup

Test your build
  • Testing for a VPC
  • Testing VPC tags
  • Testing the Subnet Setup
  • Testing Route Table
  • Testing Network Security Groups

Setting up a Test Webserver Instance

With the pre-work done, we can set up a new instance in the subnet. This is going to display a webpage based on a Python script which connects to our DynamoDB table. Therefore we are going to need to attach a role to the instance to allow it to read from DynamoDB.

In the IAM console create a new role for the instance to read DynamoDB, again using your naming convention. Attach a policy to the role; in this case we are going to attach the AWS managed policy for DynamoDB read and write access. Save it for use with our new instance.
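
If you use the console and choose EC2 as the trusted entity, the role's trust policy will look like the one below, which is what lets the instance assume the role; the managed policy covering read and write access is "AmazonDynamoDBFullAccess".

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "ec2.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}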

For this test we are only going to set up one server for the purpose of some script testing, so we are not going to distinguish between a bastion or jump host and the test webserver; we are just going to allow SSH access to the test webserver directly. This is acceptable for a short lived instance in a pure development environment, but in production use a bastion host or a service such as AWS Systems Manager.

Now we can create the test EC2 instance. Create it as before from the webserver AMI, as the installed tools and configuration edits will be useful. Give it a public IP address in the network settings and attach it to our newly created security group. Assign the instance the IAM role we created for DynamoDB access. We can use the bastion SSH key. Check all looks OK; it should be a t2.micro instance from our oxford-webserver-ami image running Amazon Linux.

Webserver Testing

We can now test this server setup

Test your build
  • Testing Webserver Instance Configuration
  • Testing Attached Security Group
  • Testing IAM Role
  • Testing IAM Role Association

Once the instance has started, make a note of its public IP address; you will need to edit the entry for the bastion host in your ".ssh/config" file. Alternatively, you can investigate AWS CloudShell, which provides an interactive shell within the web console.

Log in to your server. To test the routing and DynamoDB access, run the command "aws dynamodb list-tables". If all is well you should see the table you created earlier listed. If not, check the role permissions and that you attached the role to the instance, that the instance has a security group which allows HTTPS outbound, and that the subnet has a route to the Internet Gateway.
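
If everything is in place the output should look something like this;

{
    "TableNames": [
        "oxford-banking-transactions"
    ]
}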

To access DynamoDB from the instance we are going to use the Python Boto3 library. Boto3 is the most frequently used Python library for accessing AWS resources. To install it I'd recommend using "sudo pip install boto3"; pip will complain about installing Python libraries as root, but this works well for our demo environment.

Now you can write a script to list the transactions. If you have a look at these examples you should be able to find a way to search and retrieve the data - https://www.fernandomc.com/posts/ten-examples-of-getting-data-from-dynamodb-with-python-and-boto3/.

I'd recommend creating the script in the "/var/www/cgi-bin/" directory. Remember you will need to use sudo to edit the file as root, and to chmod it to 755 before you can execute it. Call it "dynamotransactions.py".

If you want to have a go at creating a working script to build the HTML table, please go ahead. However, the DynamoDB syntax can be complex and is quite different to SQL, so I have included a working example below. I have also included some HTML style definitions, just so our output looks a little clearer than the default;

#!/usr/bin/env python3
import boto3
import re
thisdescription = [0] * 1024
thisvalue = [0] * 1024

dynamodb_client = boto3.client("dynamodb", region_name='eu-west-1')

# Specify the table name to scan
table_name = 'oxford-banking-transactions'

... (95 more lines)

As well as using the Boto3 module for AWS access, this script uses the built-in "re" module for regular expression matching.

We use two lists, thisdescription and thisvalue, for the descriptions and values of the transactions. Because we are going to insert values at arbitrary positions (indexed by sequence number) rather than appending them in order, the lists are pre-created with 1024 blank values; this may not be best coding practice but works for today's demo.

We specify the DynamoDB client with a region and the name of the table we will scan; the region may be the one thing you need to change (unlike S3 bucket names, every account can reuse the same DynamoDB table names).

We now scan the table. This reads every entry in the table and brings back each attribute with its type and value, one line per primary key. This is very inefficient at large scale; generally you would specify a unique primary key value or range. However, it is fine for a simple demo.

Each line returned takes the form;

{'description': {'S': 'Pret a Manger'}, 'value': {'N': '9.4'}, 'Sequence Number': {'N': '3'}}

This is the attribute name, its type and its value. By using "m = re.findall(r"'([^']*)'", str(item))" we cut out every value between single quotes and turn them into entries in the list m. (The re module is very useful in Python; it's not quite as powerful as Perl pattern matching but works well for string manipulation.)

The loop takes the value of each of the attributes and outputs them as rows in an HTML table. There is some conversion between floating point numbers and strings along the way, which a more sophisticated database query might avoid, but this works for a simple demo.
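
The core of the script follows this pattern; a minimal sketch rather than the full lab script, and it assumes the attribute order shown in the example item above (the client, table name and two lists are those defined at the top of the script);

# Minimal sketch of the scan-and-parse loop described above
response = dynamodb_client.scan(TableName=table_name)
maxseq = 0
for item in response["Items"]:
    # e.g. {'description': {'S': 'Pret a Manger'}, 'value': {'N': '9.4'}, 'Sequence Number': {'N': '3'}}
    m = re.findall(r"'([^']*)'", str(item))
    # m now alternates name, type and value:
    # ['description', 'S', 'Pret a Manger', 'value', 'N', '9.4', 'Sequence Number', 'N', '3']
    seq = int(float(m[8]))          # slot the item by its Sequence Number
    thisdescription[seq] = m[2]
    thisvalue[seq] = float(m[5])
    maxseq = max(maxseq, seq)

print("Content-type: text/html\n")  # CGI response header
print("<table><tr><th>Sequence Number</th><th>Description</th><th>Value</th></tr>")
for i in range(1, maxseq + 1):
    if thisdescription[i] != 0:     # skip any unused slots
        print(f"<tr><td>{i}</td><td>{thisdescription[i]}</td><td>{thisvalue[i]:.2f}</td></tr>")
print("</table>")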

If you now go to your web browser and go to "http://(IP address of your EC2 instance)/cgi-bin/dynamotransactions.py" you should see the table of the transaction list from the Dynamo DB Database (as always substituting the current public IP address of your running instance).

Dynamic Content from S3


So far we have created a scaling static website which is managed by AWS, and a multi Availability Zone NoSQL database which is also managed by AWS. To test our Python script we have initially deployed it on a single EC2 instance. Now we need to link them together to create a cohesive website.

To do this we will change the static content in S3 to include an HTML iframe (an "Inline Frame"), which allows the list of transactions from the dynamic function to be displayed within the static page served from S3.

On your laptop, using a text editor, create the following HTML file, which we can call "dynamotransactions.html".

<html><head>
<title>Serverless Transactions List</title>
<style type="text/css">

/* grid container */
.transactions-grid {
    display:grid;
    grid-template-areas:
        'header'
        'main-content'

... (69 more lines)

Note :

Replace "(IP address of your EC2 instance)" in the body of the page above with the IP address of your EC2 instance.


If you go to your web browser and open the file, you should now have a local web page which looks like this;


firststatics3.png

If you feel that the colours in the page are too much, feel free to replace them with "background:#fff;" to change them to white, or make them more vivid if you prefer. At the moment our bank is really going for customers who also like Piet Mondrian.


The CSS in this layout is based on Art and Design by Matthew James Taylor, with iframe guidance from the ever useful w3schools.


So now we have a static webpage with dynamic content embedded in it, which is the basis for most interactive websites today. However, although the S3 web hosting and DynamoDB services are pay as you use, the EC2 service still has a comparatively high standing charge. Yes, the headline charge is under $0.01 per hour, but we need multiple instances across multiple Availability Zones for redundancy, plus load balancers, and the standing costs rise quite quickly.


Therefore we will look to replace our EC2 instance with the final part of our architecture: Lambda serverless functions.


Lambda Serverless Functions

All the major public cloud providers offer a variation of serverless function hosting. Typically you package a block of code for one of the supported language specific runtimes and the cloud service provider (CSP) then executes it on demand. Generally you are charged for the volume of code stored and the CPU cycles used in execution, but even for quite complex functions it is cheaper to run them as hosted functions for thousands of invocations per hour than to use standalone virtual machine instances. In addition, the CSP manages the scale out of the execution environment to cope with peaks and troughs in demand, and the serverless environment is far faster to respond than traditional EC2 hosting.

Below are some useful links to Lambda function documentation and guides;

https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-dynamo-db.html

https://aws.amazon.com/tutorials/run-serverless-code/

https://tonyteaches.tech/aws-lambda-website/

We will create a serverless function and the API gateway which provides access from the public Internet to our serverless function.

We will use the code we tested on our EC2 instance above, but we need to modify it for the Python function: rather than printing the HTML, the script needs to return it as a string.
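
As a minimal sketch of the shape the handler needs to take (with the HTML assembly itself elided);

def lambda_handler(event, context):
    html = "<html>...</html>"  # build the page up as a string instead of printing it
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "text/html"},
        "body": html,
    }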

In the AWS console, search for and go to the Lambda console. Select "Functions" in the left hand menu.

On the Create function page, start by selecting "Author from scratch - Start with a simple Hello World example" in the top radio buttons.

For the function name enter "dynamodb-transactions-list"

For runtime select the latest Python version, at the time of writing this is "Python 3.13"

For architecture choose "arm64"; unless you need to use language modules which are x86 specific, ARM will almost always offer better price performance.

Under Default execution role, choose "Use an existing role" then select the "service-role/oxford-lambda-dynamodb-access" role you created earlier.

Under Additional Configuration we do not need to select any of the options now, but it is worth looking at them as they are all useful services for different architectures and release mechanisms.

We can now go ahead and select "Create function".

This takes us to an integrated code editor. By default this is pre-populated with a "Hello from Lambda!" function.

Delete the pre-populated code and copy and paste the code below.

Warning

In the example below indentation is important; make sure it is preserved when you copy to the Lambda console.
You may also find the right click copy and paste commands in your browser don't work, but Ctrl/Cmd+C then V do.

import json
import boto3
import re

def lambda_handler(event, context):

	thisdescription = [0] * 1024
	thisvalue = [0] * 1024

	dynamodb_client = boto3.client("dynamodb", region_name='eu-west-1')

... (89 more lines)

Once you have entered your code in the "Code source" page, click on "Deploy" to upload the Lambda code, then "Test" to run it for the first time. When you hit "Test" you should see your HTML page plus log output of the form below;

If it prompts you to configure a test event, you can simply take the defaults and give it a name like "Click".

Status: Succeeded
Test Event Name: Click

Response:
{
  "statusCode": 200,
  "headers": {
    "Content-Type": "text/html"
  },
  "body": "\n\t<html><head><title>Transactions from DynamoDB</title>\n\t<style type=\"text/css\">\n\ttable.blueTable {\n\tborder: 1px solid #1C6EA4;\n\tbackground-color: #EEEEEE;\n\twidth: 100%;\n\ttext-align: left;\n\t}\n\ttable.blueTable td, table.blueTable th {\n\tborder: 2px solid #AAAAAA;\n\tpadding: 4px 4px;\n\t}\n\ttable.blueTable tbody td {\n\tfont-size: 18px;\n\t}\n\ttable.blueTable tr:nth-child(even) {\n\tbackground: #D0E4F5;\n\t}\n\ttable.blueTable thead {\n\tbackground: #1C6EA4;\n\tbackground: -moz-linear-gradient(top, #5592bb 0%, #327cad 66%, #1C6EA4 100%);\n\tbackground: -webkit-linear-gradient(top, #5592bb 0%, #327cad 66%, #1C6EA4 100%);\n\tbackground: linear-gradient(to bottom, #5592bb 0%, #327cad 66%, #1C6EA4 100%);\n\tborder-bottom: 2px solid #444444;\n\t}\n\ttable.blueTable thead th {\n\tfont-size: 15px;\n\tfont-weight: bold;\n\tcolor: #FFFFFF;\n\tborder-left: 2px solid #D0E4F5;\n\t}\n\ttable.blueTable thead th:first-child {\n\tborder-left: none;\n\t}\n\n\ttable.blueTable tfoot td {\n\tfont-size: 14px;\n\t}\n\ttable.blueTable tfoot .links {\n\ttext-align: right;\n\t}\n\ttable.blueTable tfoot .links a{\n\tdisplay: inline-block;\n\tbackground: #1C6EA4;\n\tcolor: #FFFFFF;\n\tpadding: 2px 8px;\n\tborder-radius: 5px;\n\t}\n\t</style>\n\t</head><body>\n\t<table class=\"blueTable\"><tr><th>Sequence Number</th><th>Description</th><th>Value</th></tr>\n\t<tr><td>1</td><td>Transport for London</td><td>6.70</td></tr>\n<tr><td>2</td><td>Cancer Research UK</td><td>10.00</td></tr>\n<tr><td>3</td><td>Pret a Manger</td><td>9.40</td></tr>\n<tr><td>4</td><td>Thames Water</td><td>53.24</td></tr>\n<tr><td>5</td><td>Amazon UK</td><td>32.45</td></tr>\n<tr><td>6</td><td>Transport for London</td><td>4.30</td></tr>\n<tr><td>7</td><td>ATM</td><td>100.00</td></tr>\n<tr><td>8</td><td>Boisdale of Canary Wharf</td><td>40.40</td></tr>\n<tr><td>9</td><td>O2</td><td>20.36</td></tr>\n<tr><td>10</td><td>Mr Fox</td><td>44.39</td></tr>\n<tr><td>11</td><td>Justgiving</td><td>101.00</td></tr>\n<tr><td>12</td><td>Kilburn Flowers</td><td>25.00</td></tr>\n</table></body></html>\n"

If you hit any issues, step through the debug output, checking for any errors in copying the code (watch out for indentation errors).

Adding an API Gateway


We now have a hosted serverless function to generate our list of transactions; the next step is to link it to an API gateway to provide an endpoint from which to access the function.

In the AWS Console, search for and go to the "API Gateway"

Where the Create API page says "Choose an API type", click on "Build"

Under "Integrations" select "Lambda", ensure the region "eu-west-1" is selected and for the function name choose the "dynamodb-transactions-list" function we created earlier in this lab. For version we can stick with the current version "2.0".

For our first API we are going to give it the name "oxford-dynamodb-lambda-request", enter this and click "Next".

Next we can select the routes which are supplied by the API gateway to the function. Routes in this context consist of two parts, a "HTTP method" e.g. "GET", "POST" or "PUT" and a "Resource Path" e.g. "/forms".

In this case we have a very simple function which at present does exactly one thing, so we are going to use the "GET" method with the path "/". Ensure that your dynamodb lambda function is selected as your Integration target. Click "Next".

At present our Lambda environment isn't sophisticated enough to need stages, which allow us to move code releases through different stages of production testing and acceptance. So we can leave "Define stages" as unchanged and click "Next" to move on to "Review and create".

This should look like the image below, if it all looks good click on "Create".

apigatewayreview.png

You should now see the name of your API in the left hand menu; click to select it. If you don't, click on "APIs" at the top of the left hand menu, select your API from the list there, then select it in the left hand menu (the layout of the API Gateway console is slightly different to other AWS services). You should see a page like the one below;


apigatewayendpoint.png

If you click on the "Invoke URL", which in this example is https://jfczednq3a.execute-api.eu-west-1.amazonaws.com/, you will see a short delay and then your transaction table will be generated.

At this point we have all the components we need: a static S3 website (in multiple Availability Zones), an API gateway facing the Internet (also in multiple AZs), a serverless function (which fails over between multiple AZs) and finally a NoSQL DynamoDB database which is hosted and replicated across multiple AZs at all times.

Linking S3 to the Serverless Function


Now we can modify our transaction list homepage so that instead of pointing to our EC2 instance it points to the API gateway endpoint. Go to the local file you created earlier in the lab and find the section as below;

<header class="header">Serverless Transactions List</header>
    <main class="main-content"> 
    <iframe height=600px width=100% style="border:none;" src="http://(IP address of your EC2 instance)/cgi-bin/dynamotransactions.py">
    </iframe> </main>
    <section class="left-sidebar"></section>
    <aside class="right-sidebar"></aside>

change this to;


<header class="header">Serverless Transactions List</header>
    <main class="main-content"> 
    <iframe height=600px width=100% style="border:none;" src="URL for your API endpoint">
    </iframe> </main>
    <section class="left-sidebar"></section>
    <aside class="right-sidebar"></aside>

In this case, replace "URL for your API endpoint" with the HTTPS endpoint for the API gateway we created in the previous step; in the example above this was https://jfczednq3a.execute-api.eu-west-1.amazonaws.com/.

Save the file locally and make sure it works by loading it as a local file in your web browser. If all looks good, make sure you have a local copy saved as "index.html", as in the next step we are going to make it the homepage for our static S3 website.

Updating the S3 Website

In the AWS Console, go back to the S3 Console page. Select the bucket ending "...-oxford-public-files" that we used at the start of this lab.

In this bucket we should have a file called "index.html". Select this file and click the "Delete" button to delete it (you may be prompted to enter some text to confirm the delete; the console protects you from mistakes in a way the API doesn't).

Now select "Upload" and upload the "index.html" file we just created on the local computer. Once we have uploaded it, click on the "index.html" object to see its properties. We should then see that this object has a public URL which looks like "https://alistair-oxford-public-files.s3.eu-west-1.amazonaws.com/index.html". If we click on this link we should see our colourful homepage with the dynamic transactions list.

To prove it is dynamic we can add an additional item to the DynamoDB table.

Go to the DynamoDB Console homepage. In the left hand menu click on "Explore items", then select the table "oxford-banking-transactions". Because we have a single table with a small number of items, you should see them all listed (around a dozen).


dynamodboxfordtransactions.png


Make a note of the highest transaction number, in the example above it is 12. Now click on the "Create Item" button.

Click on "Add New Attribute" and give it the name "Sequence Number" and a type "number". Set the sequence number to a number one higher than the highest value used so far. For description type in a description of a spending item / shop and under value add a currency amount. Then click "Create item". When you refresh the S3 hosted homepage you should see the transactions list has been updated.

Most traditional SQL databases have integrated ways to create a counter variable, but this is not a core function of DynamoDB. There are multiple ways to implement counters in DynamoDB, and there is a good discussion of the pros and cons of the different approaches in the AWS Database Blog post - Implement resource counters with Amazon DynamoDB.
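
As one illustration (not necessarily the approach the blog post recommends), a single counter item can be incremented atomically with UpdateItem; the counter table and attribute names here are hypothetical;

import boto3

dynamodb_client = boto3.client("dynamodb", region_name="eu-west-1")

# Atomically increment a counter item and read back the new value;
# ADD creates the attribute on first use
response = dynamodb_client.update_item(
    TableName="oxford-banking-counters",
    Key={"counter_name": {"S": "transaction-sequence"}},
    UpdateExpression="ADD counter_value :inc",
    ExpressionAttributeValues={":inc": {"N": "1"}},
    ReturnValues="UPDATED_NEW",
)
next_sequence = response["Attributes"]["counter_value"]["N"]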

Testing

To confirm everything is running as expected you can click the testing button below;

Test your build
  • Testing Lambda Function Creation
  • Testing API Gateway Setup
  • Testing Lambda Function Response
  • Testing S3 Website to Lambda Integration

This concludes the main part of the lab. You can now terminate / delete the EC2 instance and delete the VPC. Once you have done this you can reload your S3 hosted website to demonstrate that it is running without any VPC hosted services.

For additional testing you can now run the following


  • Apply some load to the S3 / lambda / dynamo hosted infrastructure to see how it behaves under load compared to our EC2 and load balancer hosted instances from lab 2.
  • Draw up architecture diagrams to see how we represent services which run across multiple availability zones but outside of a VPC.

Further Experiments

If you finish early or want to experiment further here are a couple of ideas for experiments;

A dynamic HTML links page

Just like the links list at the top of this page (which is static), you could create a dynamic list of links based on a DynamoDB table. The table could hold a list of URLs, link texts and descriptions, and you could then output this in different formats. This should be possible by following a similar process to the one above, just changing the HTML output format.

A DynamoDB update page

It would be useful if you could update the transactions list in DynamoDB directly from a webpage. To do this you would need;

  • An HTML web form hosted on your S3 website to collect the three attributes to be updated - see HTML Forms on w3schools for a useful guide.
  • Either a new Lambda function, or a new route to additional code in the existing function, which takes the form values as inputs and updates the DynamoDB table (a minimal sketch follows this list).
  • An update to the API endpoint, or a new endpoint if you prefer, with a POST method which links the web form to the Lambda function.
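
A minimal sketch of the Lambda side, assuming the form is submitted as a JSON POST body with these (illustrative) field names;

import json
import boto3

dynamodb_client = boto3.client("dynamodb", region_name="eu-west-1")

def lambda_handler(event, context):
    # Field names here are assumptions; match them to your web form
    form = json.loads(event["body"])
    dynamodb_client.put_item(
        TableName="oxford-banking-transactions",
        Item={
            "description": {"S": form["description"]},
            "value": {"N": str(form["value"])},
            "Sequence Number": {"N": str(form["sequence"])},
        },
    )
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "text/html"},
        "body": "<html><body>Transaction added</body></html>",
    }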

There is a useful guide to how to create the DynamoDB operations at https://docs.aws.amazon.com/apigateway/latest/developerguide/http-api-dynamo-db.html. It is well worth reading this even if you don't carry out the exercise.

A Lambda function to update S3


A different model for creating a dynamic website is not to retrieve the DynamoDB table contents on every read, but to have a serverless function which reads DynamoDB and writes out a static webpage to S3, which is then read by the client. This could be run at periodic intervals, or even with the function being triggered by a DynamoDB table update.
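
The S3 write itself is a single call; a minimal sketch, where the bucket and key names are assumptions and "html" is the page rendered from the DynamoDB scan as in the earlier function;

import boto3

s3_client = boto3.client("s3")

html = "<html>...</html>"  # the rendered transactions page
s3_client.put_object(
    Bucket="alistair-oxford-public-files",  # replace with your bucket name
    Key="transactions.html",
    Body=html.encode("utf-8"),
    ContentType="text/html",
)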

If you wanted to look into this AWS have an interesting tutorial here - DynamoDB Streams and AWS Lambda triggers.

Conclusion

This concludes the lab. In this lab we have explored how to refactor a 3 tier web application to a more cloud native design using highly resilient and close to pay as you use services. We have begun to look at how NoSQL, serverless and object storage can combine for modern cloud applications and also explored how they would fit into a larger architecture.

For these architectures we have considered availability, performance, security, operational excellence, cost and sustainability, and hopefully shown how we model services and architectures before we deploy them and then document them post deployment.