In this post, we will build a simple serverless API using the Serverless framework, with AWS Lambda as our backend handler and DynamoDB as the database. The Serverless framework comes with a variety of tools and utilities that help in deploying and managing serverless functions: it automatically sets up the API Gateway endpoints, the Lambda functions, and a CloudFormation stack for us, which makes it easy to bootstrap a serverless API.
Serverless API architecture

Our API for managing posts will need services and endpoints for the operations below.

GET

  • getAllPosts - Returns all posts in the DB
  • getPost - Returns one post by id

POST

  • createPost - Creates a new post

PUT

  • updatePost - Updates an existing post by id

DELETE

  • removePost - Deletes an existing post

We want the structure of a post object to look like this:

{
    "id": 1001,
    "content": "This is a sample post!",
    "author": "Tech Injektion",
    "createdAt": "2020-07-16T19:20+01:00",
    "updatedAt": "2020-07-16T19:20+01:00"
}

Getting Started

We must have the Serverless framework installed before we start. Check out this guide on how to install Serverless on your system, and make sure you have the latest version by running:

serverless --version

This should print the version information and also tell you if any updates are available.

$ serverless --version
Framework Core: 1.71.1
Plugin: 3.6.12
SDK: 2.3.0
Components: 2.30.10

Install the AWS CLI on your system so that we can connect to various AWS services.

Once you have AWS CLI installed and configured with your credentials, you can verify its version by running

aws --version

We are now ready to create our serverless function template.

Creating Serverless Python Template

Serverless offers many templates to start development from. We can quickly add all the boilerplate code and set up our code base by running

serverless create --template aws-python3 --name post-api --path post-api

This will generate the starter code in the directory post-api. Note that we are using the aws-python3 serverless template. The above command will produce output like

Serverless: Generating boilerplate...
Serverless: Generating boilerplate in "/home/witcher/TechInjektion/post-api"
 _______                             __
|   _   .-----.----.--.--.-----.----|  .-----.-----.-----.
|   |___|  -__|   _|  |  |  -__|   _|  |  -__|__ --|__ --|
|____   |_____|__|  \___/|_____|__| |__|_____|_____|_____|
|   |   |             The Serverless Application Framework
|       |                           serverless.com, v1.71.1
 -------'

Serverless: Successfully generated boilerplate for template: "aws-python3"

This generates a handler.py file and a serverless.yml file that defines the configuration.
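For orientation, the generated handler.py contains a single stub handler roughly like the following (the exact message text can vary with the framework version):

```python
import json


def hello(event, context):
    # Echo the incoming event back with a success message
    body = {
        "message": "Go Serverless v1.0! Your function executed successfully!",
        "input": event
    }

    response = {
        "statusCode": 200,
        "body": json.dumps(body)
    }

    return response
```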

Installing Dependencies and Plugins

You will need to install virtualenv if it is not available; note that the command differs on Windows. Install virtualenv and then create a virtual environment by running the commands below. You can follow this guide to ensure that you have virtualenv set up.

MacOS and Linux:

python3 -m pip install --user virtualenv
python3 -m venv env

Windows:

py -m pip install --user virtualenv
py -m venv env

A virtual environment named "env" should be created and you will be able to see an env directory in your project.

Run the following command to activate the virtual environment you just created.

source env/bin/activate

Install the boto3 dependency that we need to interact with AWS services by running

pip install boto3

Let's freeze the package versions of the environment into a requirements.txt file

pip freeze > requirements.txt

We need to install the serverless-python-requirements plugin to package our Python dependencies. We can get it using npm

npm init

Initialize npm with the default options by pressing Enter to accept each prompt. The command below installs the plugin from npm.

npm install --save serverless-python-requirements

Configuring the Serverless Handler

Clear the serverless.yml file and paste the code below:

service: post-api

provider:
  name: aws
  runtime: python3.8

functions:
  create:
    handler: handler.create
    events:
      - http:
          path: posts/create
          method: post
  get:
    handler: handler.get
    events:
      - http:
          path: posts/get/{postId}
          method: get
  all:
    handler: handler.all
    events:
      - http:
          path: posts/all
          method: get
  update:
    handler: handler.update
    events:
      - http:
          path: posts/update/{postId}
          method: put
  delete:
    handler: handler.delete
    events:
      - http:
          path: posts/delete/{postId}
          method: delete

resources:
  Resources:
    postsTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: posts
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: N
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
  
plugins:
- serverless-python-requirements

custom:
  pythonRequirements:
    dockerizePip: non-linux
serverless.yml

The yml file above defines the following:

  • The serverless service provider which is AWS
  • Names and details of the functions
  • Resource details for DynamoDB
  • Serverless python requirements plugin and its custom configuration
  • The custom property declares that Python dependencies are packaged in a dockerized environment on non-Linux systems
NOTE: You will need Docker installed if you are using Windows. Either install Docker or remove the custom property declaration in serverless.yml. Please note that Docker processes need Administrator access to run, so you might need to run your serverless commands with administrative rights (right-click and open cmd/PowerShell as Administrator).

We have defined all the Lambda functions, their paths, and the corresponding HTTP methods that our API is going to use. We now need to add all the functions in handler.py and write the code to interact with DynamoDB to perform the CRUD operations on the database. Note that declaring the DynamoDB table in the yml file ensures that the table is created on first deployment if it does not already exist.
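For reference, API Gateway invokes each handler with a proxy event; a trimmed sketch of the two fields our handlers read is shown below (the real event also carries headers, requestContext, and more; the values here are illustrative):

```python
import json

# A trimmed API Gateway proxy event of the shape the handlers read.
sample_event = {
    "body": json.dumps({
        "id": 1001,
        "content": "This is a sample post!",
        "author": "Tech Injektion",
    }),
    "pathParameters": {"postId": "1001"},
}

# create() reads the JSON request body from event['body']
post = json.loads(sample_event["body"])
print(post["content"])

# get(), update() and delete() read the id from event['pathParameters']
print(sample_event["pathParameters"]["postId"])
```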

import json
import logging
import boto3
import datetime
import dynamo

logger = logging.getLogger()
logger.setLevel(logging.INFO)
dynamodb = boto3.client('dynamodb')
table_name = 'posts'


def create(event, context):
    logger.info(f'Incoming request is: {event}')

    # Set the default error response
    response = {
        "statusCode": 500,
        "body": "An error occurred while creating post."
    }

    post_str = event['body']
    post = json.loads(post_str)
    current_timestamp = datetime.datetime.now().isoformat()
    post['createdAt'] = current_timestamp

    res = dynamodb.put_item(
        TableName=table_name, Item=dynamo.to_item(post))

    # If creation is successful
    if res['ResponseMetadata']['HTTPStatusCode'] == 200:
        response = {
            "statusCode": 201,
        }

    return response


def get(event, context):
    logger.info(f'Incoming request is: {event}')
    # Set the default error response
    response = {
        "statusCode": 500,
        "body": "An error occurred while getting post."
    }

    post_id = event['pathParameters']['postId']

    post_query = dynamodb.get_item(
        TableName=table_name, Key={'id': {'N': post_id}})

    if 'Item' in post_query:
        post = post_query['Item']
        logger.info(f'Post is: {post}')
        response = {
            "statusCode": 200,
            "body": json.dumps(dynamo.to_dict(post))
        }

    return response


def all(event, context):
    # Set the default error response
    response = {
        "statusCode": 500,
        "body": "An error occurred while getting all posts."
    }

    scan_result = dynamodb.scan(TableName=table_name)['Items']

    posts = []

    for item in scan_result:
        posts.append(dynamo.to_dict(item))

    response = {
        "statusCode": 200,
        "body": json.dumps(posts)
    }

    return response


def update(event, context):
    logger.info(f'Incoming request is: {event}')

    post_id = event['pathParameters']['postId']

    # Set the default error response
    response = {
        "statusCode": 500,
        "body": f"An error occurred while updating post {post_id}"
    }

    post_str = event['body']

    post = json.loads(post_str)

    res = dynamodb.update_item(
        TableName=table_name,
        Key={
            'id': {'N': post_id}
        },
        UpdateExpression="set content=:c, author=:a, updatedAt=:u",
        ExpressionAttributeValues={
            ':c': dynamo.to_item(post['content']),
            ':a': dynamo.to_item(post['author']),
            ':u': dynamo.to_item(datetime.datetime.now().isoformat())
        },
        ReturnValues="UPDATED_NEW"
    )

    # If the update is successful for the post
    if res['ResponseMetadata']['HTTPStatusCode'] == 200:
        response = {
            "statusCode": 200,
        }

    return response


def delete(event, context):
    logger.info(f'Incoming request is: {event}')

    post_id = event['pathParameters']['postId']

    # Set the default error response
    response = {
        "statusCode": 500,
        "body": f"An error occurred while deleting post {post_id}"
    }

    res = dynamodb.delete_item(TableName=table_name, Key={
                               'id': {'N': post_id}})

    # If deletion is successful for post
    if res['ResponseMetadata']['HTTPStatusCode'] == 200:
        response = {
            "statusCode": 204,
        }
    return response
handler.py

In the code above, we have written the five methods that perform basic CRUD operations on our DynamoDB table called "posts". You can read more about CRUD operations on DynamoDB with Python here. We have also used some utility functions to convert regular JSON into DynamoDB-supported JSON. These functions are defined in a separate file, dynamo.py.

# A utility function to convert a dict into DynamoDB object
def to_item(raw):
    if type(raw) is dict:
        resp = {}
        for k, v in raw.items():
            if type(v) is str:
                resp[k] = {
                    'S': v
                }
            elif type(v) is int:
                resp[k] = {
                    'N': str(v)
                }
            elif type(v) is dict:
                resp[k] = {
                    'M': to_item(v)
                }
            elif type(v) is bool:
                resp[k] = {
                    'BOOL': v
                }
            elif type(v) is list:
                resp[k] = {
                    'L': [to_item(i) for i in v]
                }
        return resp
    elif type(raw) is str:
        return {
            'S': raw
        }
    elif type(raw) is int:
        return {
            'N': str(raw)
        }

# A utility function to convert a DynamoDB object into a dict(json)
def to_dict(raw):
    if type(raw) is dict:
        resp = {}
        for k, v in raw.items():
            if 'S' in v:
                resp[k] = v['S']
            elif 'N' in v:
                resp[k] = int(v['N'])
            elif 'M' in v:
                resp[k] = to_dict(v['M'])
            elif 'BOOL' in v:
                resp[k] = bool(v['BOOL'])
            elif 'L' in v:
                # Lists hold typed values; wrap each one so it can be
                # decoded recursively
                resp[k] = [to_dict({'i': i})['i'] for i in v['L']]
        return resp
dynamo.py

In DynamoDB, every value is wrapped in an object whose key declares the value's type. e.g.

# Regular JSON for request and response
{
    "id": 1001,
    "content": "This is a sample post!",
    "author": "Tech Injektion",
    "createdAt": "2020-07-16T19:20+01:00"
}

# Supported by DynamoDB
{
    "id": {"N": "1001"},
    "content": {"S": "This is a sample post!"},
    "author": {"S": "Tech Injektion"},
    "createdAt": {"S": "2020-07-16T19:20+01:00"}
}
How DynamoDB stores and returns objects

To learn about the other supported attribute types, check the DynamoDB documentation.

Now that we have our code ready, it's time to deploy and test it. Run the following command inside the virtual environment to deploy your Lambda functions to AWS.

serverless deploy -v

It will take a while for CloudFormation to provision all the AWS resources and deploy the Lambda functions. Once done, navigate to your AWS console and open AWS Lambda; you should see 5 new functions. Then navigate to API Gateway, where we will test the newly added endpoints.

API gateway for created resources

Click on the /create resource's POST method and then on the TEST icon to test the method.

/create resource POST method

In the request body paste the following and click on Test.

{
    "id": 1001,
    "content": "This is a sample post!",
    "author": "Tech Injektion"
}
Sample create request

We should get an HTTP 201 (Created) response.

Add a few more posts by varying the "id" and "content" fields and hitting Test.

# Create post 2
{
    "id": 1002,
    "content": "This is a Python post!",
    "author": "Tech Injektion"
}

# Create post 3
{
    "id": 1003,
    "content": "This is a Java post!",
    "author": "Tech Injektion"
}
Sample json to create more posts

Now we can test whether all the posts were created by fetching them. To test the "/all" resource, click on it, select the GET method, and then Test.

The "all" resource should return all the posts present in the database. Notice that the createdAt field was populated by our create function.
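With the three posts created above, the response body should look something like this (timestamps will differ, and the scan order is not guaranteed):

```json
[
    {"id": 1001, "content": "This is a sample post!", "author": "Tech Injektion", "createdAt": "2020-07-16T19:20:45.123456"},
    {"id": 1002, "content": "This is a Python post!", "author": "Tech Injektion", "createdAt": "2020-07-16T19:21:02.654321"},
    {"id": 1003, "content": "This is a Java post!", "author": "Tech Injektion", "createdAt": "2020-07-16T19:21:18.987654"}
]
```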

Let's fetch a single post by specifying its id. Test the /get/{postId} resource for this, entering a valid id in the path variable.

Testing the /get resource with a path parameter

Let's try updating the content of a post by testing the /update/{postId} resource: we'll change the Java post to a Go Lang one. Use 1003 in the path variable; the body should carry the new content.

{
    "content": "This is a Go Lang post!",
    "author": "Tech Injektion"
}

You should get an HTTP 200 (OK) response.

Navigate to DynamoDB to see the created data in the posts table.

Table items in DynamoDB

Notice that our Java post was updated to Go Lang and now also has an updatedAt field. Go back to API Gateway and delete the sample post using the /delete/{postId} resource.

We should get an HTTP 204 response, and the item is deleted.

While API Gateway helps us test our API quickly, it is not how the API we have built will normally be consumed. So you can also try calling the API from curl or an external app like Postman. To do this, you need to know the endpoint your API is hosted on: in API Gateway, click on the "Stages" tab and copy the Invoke URL after clicking on the "dev" stage.

Finding the endpoint in API gateway

We will be using Postman for this example. Open a new request in Postman and set the endpoint to the Invoke URL with "posts/all" appended. Let's send a simple GET request to fetch all the current posts from the API.

You should get the below response:

{
    "message": "Missing Authentication Token"
}
Not Authenticated Response from API Gateway

Our API is not public, and we must provide authentication parameters to verify our identity before calling it from an external system. For this purpose, we need a user who has access to API Gateway. Let's create one: go to IAM -> Users and add a new user called "api-tester".

In access type check Programmatic access only.

For permissions, select Attach existing policies and search for API gateway. Check the AmazonAPIGatewayInvokeFullAccess policy and click next.

Review and create the user. Once created, you will see the Access key ID and Secret access key for the user; these are the values needed to authenticate to our API Gateway.

In Postman, click on the Authorization tab and select AWS Signature as the type. Paste your access and secret keys here.

Setting the AWS Signature authentication

We are now ready to send a request to the API.

Response from API gateway

Congratulations! We successfully authenticated with and invoked our API externally to get a list of all the posts present in DynamoDB.

You can find the code used in this post on GitHub.