Microservice using Serverless and the local S3 plugin

13-04-2023
source code

Introduction

In this article I will go through creating an AWS Lambda that resizes an image. The Lambda is triggered by an S3 event, and the full example can be run locally without the need to create the Lambda on AWS.

Background

I have always been a fan of running the development stack locally. With the introduction of microservices and cloud infrastructure this becomes a daunting task, and frameworks like Serverless have made a huge difference toward solving it. In this article I will show you how to create a Lambda that is triggered by an S3 event: the Lambda reads the newly uploaded image, creates a thumbnail out of it, and stores the new image in another S3 bucket. Although the logic of this application is not complex (resizing an image), the fact that you can test everything locally is very powerful.

Application overview

The application was created using the Serverless starter template. I have added two Serverless plugins: serverless-s3-local and serverless-offline. serverless-s3-local runs a local S3 instance, and serverless-offline allows the Lambda to run locally. I have also added a utility script that uploads an image to trigger the S3 ObjectCreated event.
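Both plugins are already listed in the repo's package.json, so installing dependencies pulls them in. If you are starting from a fresh Serverless template you would add them yourself as dev dependencies, for example:

npm install --save-dev serverless-s3-local serverless-offline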

Testing locally

After cloning the repo locally, install the dependencies by running npm i. Open a new terminal tab and start Serverless offline by running npm run run:offline. Go back to the first tab and run node upload-image.js to upload an image; this executes a script that uploads the image found under the resources folder.
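The run:offline script itself is not shown in this article; it is assumed to map to the Serverless CLI roughly as in the sketch below (the exact script in the repo may differ). The start variant is commonly used with serverless-s3-local because the plugin hooks into the offline start lifecycle to spin up the local bucket.

{
  "scripts": {
    "run:offline": "serverless offline start"
  }
}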

index file

This file contains the s3hook handler, which responds to the S3 event listed in serverless.yml. Below is the content of this file.

const AWS = require('aws-sdk');
const sharp = require('sharp');
require('dotenv').config();

const s3hookConfig = {
  s3Url: process.env.S3_URL,
  thumbnailBucketName: process.env.THUMBNAIL_BUCKET_NAME,
  accessKeyId: process.env.ACCESS_KEY_ID,
  secretAccessKey: process.env.SECRET_ACCESS_KEY,
};

module.exports.s3hook = async (event, context) => {

  // S3 instance
  const S3 = new AWS.S3({
    s3ForcePathStyle: true,
    accessKeyId: s3hookConfig.accessKeyId,
    secretAccessKey: s3hookConfig.secretAccessKey,
    endpoint: new AWS.Endpoint(s3hookConfig.s3Url),
  });

  // Retrieve bucket name and file name 
  const bucket = event.Records[0].s3.bucket.name;
  const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
  const params = {Bucket: bucket, Key: key};

  // read image
  const response = await S3.getObject(params).promise();

  // resize
  const thumbnail = await sharp(response.Body)
    .resize(320, 240)
    .toBuffer();

  // save thumbnail
  await S3.putObject({
    Bucket: s3hookConfig.thumbnailBucketName,
    Key: key,
    Body: thumbnail
  }).promise();

  // delete file
  await S3.deleteObject(params).promise();

  return {
    statusCode: 200,
    body: JSON.stringify(
      {
        message: `Thumbnail image with name ${key} was successfully created`,
      }),
  };
};

s3hookConfig contains configuration values that can be changed if this Lambda runs in different environments. To change these values, edit the .env file.
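For local testing, the .env file might look something like the sketch below. The S3RVER credentials and the localhost:4569 endpoint mirror the values used in upload-image.js; the thumbnails bucket name is an assumption, so use whatever bucket you actually create for thumbnails.

# Sketch of a local configuration; THUMBNAIL_BUCKET_NAME is an assumed value
S3_URL=http://localhost:4569
THUMBNAIL_BUCKET_NAME=thumbnails
ACCESS_KEY_ID=S3RVER
SECRET_ACCESS_KEY=S3RVER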

The rest of the file is self-explanatory. First I extract the bucket name and the key (the file name); these values are needed to read the uploaded file.

Next I resize the image using the sharp package and save the result to the thumbnail bucket. After that I delete the original image; note that this last step is optional.
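As a side note, resize(320, 240) with sharp's default fit produces an exact 320x240 thumbnail, cropping the image if needed. If you would rather keep the whole image and preserve its aspect ratio, sharp accepts a fit option; this is an optional variation, not part of the handler above.

  // Optional variation: keep the aspect ratio and fit the whole image inside a 320x240 box
  const thumbnail = await sharp(response.Body)
    .resize(320, 240, { fit: 'inside' })
    .toBuffer();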

upload-image file

This file can be run as a script to upload an image; the image is provided in the resources folder. The script is designed to test the workflow locally, so configuration values are hardcoded, which is fine in this instance.

const fs = require('fs');
const path = require('path');
const AWS = require('aws-sdk');

const uploadImage = async () => {
    const imagePath = path.resolve(__dirname, 'resources/beach.jpeg');
    const image = fs.readFileSync(imagePath);
    const S3 = new AWS.S3({
        s3ForcePathStyle: true,
        accessKeyId: "S3RVER",
        secretAccessKey: "S3RVER",
        endpoint: new AWS.Endpoint("http://localhost:4569"),
    });
    await S3.putObject({
        Bucket: "photos",
        Key: "beach.jpeg",
        Body: image
    }).promise();
}

uploadImage()
    .then(() => console.log("Image uploaded successfully"))
    .catch((error) => console.error("Image upload failed", error));

The uploadImage function reads the image from the filesystem and uploads it to the S3 bucket using the putObject method on the S3 instance.
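To confirm the end-to-end flow worked, you can list the thumbnail bucket after the Lambda has run. The helper below is a hypothetical addition rather than part of the repo; it reuses the same local endpoint and S3RVER credentials as upload-image.js and assumes the thumbnail bucket is named thumbnails (use the value from your .env file).

const AWS = require('aws-sdk');

// Hypothetical helper: list the objects in the local thumbnail bucket
const listThumbnails = async () => {
    const S3 = new AWS.S3({
        s3ForcePathStyle: true,
        accessKeyId: "S3RVER",
        secretAccessKey: "S3RVER",
        endpoint: new AWS.Endpoint("http://localhost:4569"),
    });
    // "thumbnails" is an assumed bucket name
    const { Contents } = await S3.listObjectsV2({ Bucket: "thumbnails" }).promise();
    console.log((Contents || []).map((object) => object.Key));
};

listThumbnails()
    .catch((error) => console.error("Could not list thumbnails", error));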

serverless.yml file

I have changed this file to allow the Lambda to work locally.

service: lambda-thumbnail-creator
frameworkVersion: '3'

provider:
  name: aws
  runtime: nodejs18.x
  timeout: 30

functions:
  s3hook:
    handler: index.s3hook
    events:
      - s3: photos
        event: s3:ObjectCreated:*

plugins:
  - serverless-s3-local
  - serverless-offline
custom:
  s3:
    host: localhost
    directory: ./local-s3

The custom.s3 section is where the local S3 instance is configured. The local-s3 directory, which backs the buckets, can exist anywhere on the filesystem; in this instance it lives inside the project codebase. Under the functions section I define the s3hook Lambda and specify the S3 event that triggers it, in this case the ObjectCreated event.

I have also added a timeout under the provider section because the Lambda was timing out. The rest of the file is what you get from the Serverless starter template.

Room for improvement

This example can be used as a starter for a small microservice that reacts to AWS S3 events. There are a number of changes that would make it more robust; here are some suggestions, with a rough sketch after the list.

  • Error handling: what should happen if one of the S3 or sharp calls fails?
  • File type restriction: the Lambda should make sure that the uploaded file is in fact an image before attempting to resize it.
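Below is a minimal sketch of what these two improvements could look like in the handler. The allowed extension list and the error response shape are assumptions, not part of the original code; the requires, s3hookConfig, and the actual S3 and sharp calls stay the same as in index.js.

// Assumed list of acceptable image extensions
const ALLOWED_EXTENSIONS = ['.jpg', '.jpeg', '.png', '.webp'];

module.exports.s3hook = async (event) => {
  const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));

  // File type restriction: skip anything that does not look like an image
  const extension = key.slice(key.lastIndexOf('.')).toLowerCase();
  if (!ALLOWED_EXTENSIONS.includes(extension)) {
    return {
      statusCode: 400,
      body: JSON.stringify({ message: `Unsupported file type: ${extension}` }),
    };
  }

  try {
    // ...the getObject / sharp resize / putObject / deleteObject calls from index.js go here...
    return {
      statusCode: 200,
      body: JSON.stringify({ message: `Thumbnail image with name ${key} was successfully created` }),
    };
  } catch (error) {
    // Error handling: log the failure and return a 500 instead of letting the Lambda crash
    console.error(`Failed to create thumbnail for ${key}`, error);
    return {
      statusCode: 500,
      body: JSON.stringify({ message: `Failed to create thumbnail for ${key}` }),
    };
  }
};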

Conclusion

In this blog post I have explained how to create an AWS Lambda that responds to S3 events, and how all of this can be tested locally.