- Overview
- Installation and setup
- E2E testing support setup & configuration
- AWS event mocks
- Examples
- An opinionated approach to serverless testing
Feedback is appreciated! If you have an idea for how this plugin/library can be improved (or even just a complaint/criticism) then please open an issue.
Running tests on deployed services (vs locally mocked ones) is an important final step in a robust serverless deployment pipeline because it isn't possible to recreate all aspects of a final solution locally - concerns such as fine-grained resource access through IAM and the scalability/performance characteristics of the system can only be assessed while the application is running on AWS. Running these tests against stage- or branch-specific versions of the application (see the serverless testing best practices below) is difficult given the dynamic nature of AWS resource naming. This library makes it easier to write post-deployment tests for applications and services written and deployed using the Serverless Framework by persisting dynamic AWS resource information, such as endpoint URLs, locally and exposing it to your tests via easily imported helper functions.
Because unit tests with mocked AWS services are still an important part of a well-tested service (especially for fast developer feedback), this library also includes helper functions to simplify the creation of mock events for the various AWS Lambda integrations.
Install and save the library to `package.json` as a dev dependency:

```bash
npm i --save-dev serverless-plugin-test-helper
```

or

```bash
yarn add serverless-plugin-test-helper -D
```
Two parts of this library work together to support E2E testing of your deployed serverless apps:

- A Serverless Framework plugin which extends `sls deploy` to save a copy of the generated CloudFormation Stack Output locally - this will persist the dynamically-generated API Gateway endpoint, for example.
- A standard Node.js library which can be imported to access local stack output values in tests (or any other code you want to run post-deployment) - this will allow you to access the dynamically-generated API Gateway endpoint that the plugin saved.
To set up the plugin, add the library to the `plugins` section of `serverless.yml`:

```yaml
plugins:
  - serverless-plugin-test-helper
```
By default the plugin will generate a file containing stack outputs at `.serverless/stack-output/outputs.yml`, which is where the library pulls values from. You can optionally specify an additional path for storing outputs by using the `custom` section of `serverless.yml` with the `testHelper` key:

```yaml
custom:
  testHelper: # The 'testHelper' key is used by the plugin to pull in the optional path value
    path: optional/path/for/another/outputs[ .yml | .yaml | .json ]
```
Import the helper functions into your test files to retrieve values from deployed stack output:

```typescript
import { getApiGatewayUrl, getDeploymentBucket, getOutput } from 'serverless-plugin-test-helper';

const URL = getApiGatewayUrl();
const BUCKET_NAME = getDeploymentBucket();
const DOCUMENT_STORAGE_BUCKET_NAME = getOutput('DocumentStorageBucket');
```
- `getApiGatewayUrl()` returns the URL of the deployed API Gateway service (if using `http` or `httpApi` as an event type in `serverless.yml`)
- `getDeploymentBucket()` returns the name of the bucket the Serverless Framework generates for uploading CloudFormation templates and zipped source code files as part of the `sls deploy` process
- `getOutput('output-key-from-stack-outputs')` returns the value of the CloudFormation stack output with the specified key
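For example, an end-to-end test can pull the deployed endpoint from the persisted stack output and make a real HTTP request against it. Below is a minimal sketch assuming Jest and Node's built-in `fetch`; swap in whatever test runner, HTTP client, and route your project actually uses:

```typescript
import { getApiGatewayUrl } from 'serverless-plugin-test-helper';

describe('deployed API', () => {
  it('responds at the deployed endpoint', async () => {
    // getApiGatewayUrl() reads the endpoint from the locally persisted stack output
    const url = getApiGatewayUrl();

    // '/hello' is a placeholder route - use one your service actually exposes
    const response = await fetch(`${url}/hello`);

    expect(response.status).toBe(200);
  });
});
```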
To see which output values are available for reference you can check the generated `.serverless/stack-output/outputs.yml` file after a deployment. To make additional values available you can specify up to 60 CloudFormation Stack Outputs in `serverless.yml` using the `resources > Outputs` section:
```yaml
resources:
  Outputs:
    # Generic example
    Output1: # This is the key that will be used in the generated outputs file
      Description: This is an optional description that will show up in the CloudFormation dashboard
      Value: { Ref: CloudFormationParameterOrResourceYouWishToExport }

    # Example referencing a custom S3 bucket used for file storage (defined under Resources section below)
    DocumentStorageBucket: # This is the key that will be used in the generated outputs file
      Description: Name of the S3 bucket used for document storage by this stack
      Value: { Ref: DocumentStorageBucket }

  Resources:
    DocumentStorageBucket:
      Type: AWS::S3::Bucket
```
See the AWS CloudFormation documentation on outputs for more information on stack outputs.
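A custom output like `DocumentStorageBucket` can then be consumed in a post-deployment test the same way as the built-in values. The sketch below assumes Jest and the AWS SDK v3 S3 client (`@aws-sdk/client-s3`) are installed as dev dependencies; neither ships with this library:

```typescript
import { getOutput } from 'serverless-plugin-test-helper';
import { S3Client, HeadBucketCommand } from '@aws-sdk/client-s3';

describe('document storage bucket', () => {
  it('exists in the deployed stack', async () => {
    // Resolve the bucket name from the persisted CloudFormation stack output
    const bucketName = getOutput('DocumentStorageBucket');

    // HeadBucket rejects if the bucket does not exist or is not accessible
    const s3 = new S3Client({});
    await expect(s3.send(new HeadBucketCommand({ Bucket: bucketName }))).resolves.toBeDefined();
  });
});
```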
Import the helper functions and static objects into your test files to generate AWS event and method signature mocks with optional value overrides. Note that this portion of the library can be used independently of the E2E testing module.
```typescript
import {
  ApiGatewayEvent,
  ApiGatewayTokenAuthorizerEvent,
  DynamoDBStreamEvent,
  HttpApiEvent,
  SnsEvent,
  context
} from 'serverless-plugin-test-helper';
import { handler } from './lambda-being-tested';

// Setup events with optional value overrides
const event = new ApiGatewayEvent({ body: 'overridden body value' });
const event2 = new ApiGatewayTokenAuthorizerEvent();
const event3 = new DynamoDBStreamEvent();
const event4 = new HttpApiEvent();
const event5 = new SnsEvent();

// ...

// Invoke the handler functions with events
const result = await handler(event, context);
const result2 = await handler(event2, context);

// TODO write your tests on the results
```
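To turn that TODO into an actual unit test, wrap the invocation in your test runner and assert on the handler's response. A minimal Jest-style sketch, assuming the handler returns an API Gateway proxy response with `statusCode` and `body` (adjust to whatever your handler actually returns):

```typescript
import { ApiGatewayEvent, context } from 'serverless-plugin-test-helper';
import { handler } from './lambda-being-tested';

describe('handler', () => {
  it('returns a 200 response for an overridden body', async () => {
    // Mock API Gateway event with an overridden request body
    const event = new ApiGatewayEvent({ body: JSON.stringify({ message: 'hello' }) });

    const result = await handler(event, context);

    // These assertions assume an API Gateway proxy-style response shape
    expect(result.statusCode).toBe(200);
    expect(result.body).toBeDefined();
  });
});
```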
There is one working example of how this library can be used in a simple 'hello world' serverless application:
Due to tight coupling with managed services and the difficulty in mocking those same services locally, end-to-end testing is incredibly important for deploying and running serverless applications with confidence. I believe that a good serverless deployment pipeline setup should include the following steps, in order:
1. Install project and dependencies
2. Run unit tests
3. Deploy to a static, non-production environment like staging (using the `--stage staging` option in Serverless Framework)†
4. Run e2e tests in the static, non-production environment†
5. Optional: include a manual approval step if you want to gate production deploys
6. Deploy to the production environment (with `--stage production`)
7. Run e2e tests in production

† Repeat steps 3 and 4 for however many static, non-production environments you have (development, staging, demo, etc.)
For dynamic, branch- or developer-specific environments the pipeline looks similar:

1. Install project and dependencies
2. Run unit tests
3. Deploy to a dynamic, non-production environment (with the `--stage <branch or username>` option in Serverless Framework)
4. Run e2e tests in the dynamic, non-production environment
5. Automate the cleanup of stale ephemeral environments with a solution like Odin
* Note that these kinds of pipelines work best using trunk-based development