Test Cloudflare Workers with Jest, Wrangler and Travis

Learn how to automate functional and integrated testing for your Cloudflare Worker using Jest, Wrangler and Travis.

Previously, we looked at ways to improve the performance of a Ghost blog. One of the tools we used was a combination of Cloudflare Workers and Image Resizing. While working on it, I wondered whether there was an easier way to run automated tests for my Workers.

This article will guide you through the setup and configuration needed to ensure your Workers are tested before deployment!


Sample code

Before we move forward, I wanted to mention that the complete code for this tutorial is available on GitHub. You can use it to follow the article more easily, and you are more than welcome to comment, fork, and so on.

The Worker code is quite simple. It implements an API that accepts query string parameters, performs simple maths (addition, subtraction, multiplication and division) and returns the outcome in JSON format.
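To make the rest of the article easier to follow, here is a simplified sketch of what such a Worker can look like. I reconstructed it from the API description above, so treat the details (the 'KO' error outcome, the exact parameter handling) as assumptions rather than the actual contents of the repository's index.js:

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

async function handleRequest(request) {
  // Read m, n and op from the query string
  const params = new URL(request.url).searchParams
  const m = parseFloat(params.get('m'))
  const n = parseFloat(params.get('n'))
  const op = params.get('op')

  // Map the op parameter to the corresponding operation
  const operations = {
    add: (a, b) => a + b,
    sub: (a, b) => a - b,
    mul: (a, b) => a * b,
    div: (a, b) => a / b
  }

  const headers = { 'Content-Type': 'application/json' }

  // 'KO' as an error outcome is an assumption for this sketch
  if (isNaN(m) || isNaN(n) || !(op in operations)) {
    return new Response(JSON.stringify({ outcome: 'KO' }), { headers })
  }

  const result = operations[op](m, n)
  return new Response(JSON.stringify({ outcome: 'OK', result }), { headers })
}
index.js (simplified sketch)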

Our objective is to improve and automate our test coverage, both from a functional standpoint and an end-to-end perspective.

Testing

In terms of testing your JavaScript code, there are plenty of options available.

My favourite is Jest: it's a lightweight and expressive framework which allows me to write concise and readable tests. It is also straightforward to plug into the build pipeline.

If you use npm, here is how you install it:

npm install --save-dev jest

Then, we can modify our package.json to include the following configuration:

  "scripts": {
    "test": "jest --coverage"
  },
  "jest": {
    "collectCoverageFrom": [
      "index.js",
      "src/*.js"
    ]
  }
package.json

With this, I have defined a test script that runs jest against the test suites added to my project; we will look at these in more detail later. The --coverage flag tells Jest to also measure how much of our code is covered by the tests.

With the collectCoverageFrom option, we tell Jest which files we want to measure coverage for. In my project, index.js is the main Worker script, and the src folder contains the logic I factored out to make testing a bit easier.
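For context, here is a sketch of what a module like maths.js can look like, reconstructed from the tests that follow; the actual file in the repository may differ in the details:

// src/maths.js - reconstructed sketch, not the exact repository code
class Maths {
  constructor(m, n) {
    this.m = m
    this.n = n
  }

  sum() {
    return this.m + this.n
  }

  subtract() {
    return this.m - this.n
  }

  multiply() {
    return this.m * this.n
  }

  divide() {
    return this.m / this.n
  }
}

module.exports = Maths
src/maths.js (sketch)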

Let's now have a look at the tests. For example, I have a maths.test.js (link) which defines the tests for my maths.js (link) module. Here is an excerpt:

const Maths = require('../src/maths')

it('Sums', () => {
    const test = new Maths(1,2)
    expect(test.sum()).toBe(3)
});

it('Subtracts', () => {
    const test = new Maths(9,2)
    expect(test.subtract()).toBe(7)
});

it('Subtracts (negative result)', () => {
    const test = new Maths(4,10)
    expect(test.subtract()).toBe(-6)
});
Some of the tests defined for my code

The above tests ensure that the basic business logic is correct. Once we have written them, we can run npm run test to verify that the outcome is the desired one. Here is some sample output from another project of mine:

$ npm run test

> image-resizing@1.0.0 test /image-resizing
> jest --coverage

 PASS  test/imageComponents.test.js
 PASS  test/resizerOptions.test.js
---------------------|---------|----------|---------|---------|-------------------
File                 | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
---------------------|---------|----------|---------|---------|-------------------
All files            |    64.1 |    81.82 |      75 |   62.16 |
 image-resizing      |       0 |        0 |       0 |       0 |
  index.js           |       0 |        0 |       0 |       0 | 7-37
 image-resizing/src  |     100 |      100 |     100 |     100 |
  imageComponents.js |     100 |      100 |     100 |     100 |
  resizerOptions.js  |     100 |      100 |     100 |     100 |
---------------------|---------|----------|---------|---------|-------------------

Test Suites: 2 passed, 2 total
Tests:       9 passed, 9 total
Snapshots:   0 total
Time:        2.625 s, estimated 3 s
Ran all test suites.
Sample jest output

The summary shows whether the tests were successful and, if not, highlights which assertions were not met. It also gives us an idea of how much of the code is covered by our test scripts.

Building & Testing pipeline

Once we have added our tests, we can start plugging our project into a CI tool. For open-source projects, we can use Travis. You can follow the full setup documentation here: to begin with, sign up and grant the appropriate authorizations for the repository of interest.

Once this has been done, you can add a .travis.yml configuration file to the root of the repository. Here is my example (link):

os: linux
dist: focal
language: node_js
node_js:
    - node
before_install:
    - pip install --user codecov
    - npm i @cloudflare/wrangler -g
after_success:
    - codecov --file coverage/lcov.info --disable search
.travis.yml

The configuration tells Travis how to prepare the build before running the tests, and what to do if the outcome is successful. Note that for node_js projects Travis runs npm test by default, which is how our test script gets executed without being mentioned explicitly in the file. For a complete explanation, I once again recommend the official documentation.

Testing the worker end to end

Great: we have seen how to add functional tests for our code, and how to automate the build and test process for our Cloudflare Worker. Something is still missing, though: we are not yet testing our Worker code end to end.

Cloudflare provides Wrangler, a CLI tool to manage and interact with Workers during development, testing and deployment. Not too long ago, wrangler dev was released, facilitating local testing. The developer can fire up wrangler dev and send test requests to localhost, while Wrangler takes care of deploying the test code to the nearest Cloudflare data centre and routing the local requests there, for a high-fidelity test environment.
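For a quick manual check, you can start wrangler dev in one terminal and curl the local endpoint from another. The JSON response below assumes the sample maths API described earlier; your exact response shape may differ:

$ wrangler dev

# in a second terminal
$ curl "http://127.0.0.1:8787/?m=6&n=5&op=mul"
{"outcome":"OK","result":30}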

What if we could automate this with Jest, and add some integration tests to our Jest test suites?  In the next section, we will see an approach that I am experimenting with, still with some quirks!

🚨 Important: this setup is still experimental. As the wrangler CLI itself explains in its output log:

wrangler dev is currently unstable and there are likely to be breaking changes! For this reason, we cannot yet recommend using wrangler dev for integration testing.

Keep this in mind while experimenting!

[1] Write a Jest test suite for the integrated tests

First of all, in addition to our functional tests, we will add a test suite that fires HTTP calls at our Worker and validates each outcome. To implement this, I used the node-fetch package and wrote my test suite (link):

const fetch = require('node-fetch');

it('Adds correctly (with op)', async () => {
  return fetch('http://127.0.0.1:8787/?m=201&n=1&op=add')
    .then(res => res.json())
    .then(json => {
      expect(json.outcome).toBe("OK")
      expect(parseInt(json.result)).toBe(202)
    })
});

it('Subtracts correctly', async () => {
  return fetch('http://127.0.0.1:8787/?m=100&n=30&op=sub')
    .then(res => res.json())
    .then(json => {
      expect(json.outcome).toBe("OK")
      expect(parseInt(json.result)).toBe(70)
    })
});
Example integrated tests 

The tests make HTTP calls to the local endpoint at 127.0.0.1:8787, which is where wrangler dev exposes the test environment. We can then directly verify the responses produced by our Worker and make assertions about their contents.
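If you are following along, node-fetch must be available as a dev dependency first (assuming you use npm):

npm install --save-dev node-fetch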

[2] Add wrangler dev to the 'test' script and coordinate

The next step is to ensure that wrangler dev is up and running before we start our tests. The idea is to use the Travis configuration to prepare for and coordinate this.

The first part is achieved in our .travis.yml file (as seen above):

os: linux
dist: focal
language: node_js
node_js:
    - node
before_install:
    - pip install --user codecov
    - npm i @cloudflare/wrangler -g
after_success:
    - codecov --file coverage/lcov.info --disable search
.travis.yml

In particular, in the before_install section we tell Travis to install Wrangler so that it is available for later use.

The other thing we need to do is create a couple of environment variables for our Cloudflare Account ID and Cloudflare API Token, which are required by wrangler. You can see the documentation explaining how to add them as Repository variables here, or save them as encrypted variables in your .travis.yml file. In my case, I picked the first option.
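If you prefer the encrypted route instead, the travis CLI can append the values to .travis.yml for you. Wrangler picks up the CF_ACCOUNT_ID and CF_API_TOKEN environment variables, so something along these lines should work; double-check the variable names against the Wrangler documentation for your version:

travis encrypt CF_ACCOUNT_ID=<your-account-id> --add env.global
travis encrypt CF_API_TOKEN=<your-api-token> --add env.global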

Let's now go back to the test script line that I have in my package.json (link):

 "scripts": {
    "test": "concurrently --success first --kill-others \"wrangler dev \" \"wait-on -d 3000 -t 30000 http-get://localhost:8787 && jest --coverage\""
  },

A few things are happening:

  • I am using concurrently, a Node tool that runs multiple commands in parallel. In our case, we want to run wrangler dev alongside our jest command, so that the tests execute at the appropriate time.
  • As you can imagine, running things in parallel immediately calls for coordination 😅. For that, I am using another utility called wait-on, which allows me to "align the stars" before running my tests.

Looking more closely at the wait-on parameters: I define a delay of 3 seconds (-d 3000, in milliseconds) before the readiness check starts. The check fires an HTTP GET to localhost:8787 and expects an HTTP 200 response back; it keeps retrying for at most 30 seconds (-t 30000) before giving up.
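Both concurrently and wait-on are ordinary npm packages, so they need to be added as dev dependencies too (again assuming npm):

npm install --save-dev concurrently wait-on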

To summarise:

  • We use concurrently to spawn wrangler dev and wait-on
  • wrangler dev will attempt to deploy a test worker and also expose an HTTP endpoint locally on port 8787, ready to accept incoming test requests.
  • In the meantime, wait-on will wait 3 seconds and then start making local HTTP GET requests on port 8787, expecting an HTTP 200 code back.
  • Once wrangler has started up, the check will succeed and wait-on will terminate, at which point we launch jest --coverage.
  • jest --coverage runs the functional and integrated tests for our Worker.
  • ✅ If the tests pass, jest will exit with a success code. Because we told concurrently to use the result of the first command that finishes (--success first) and to kill the remaining ones (--kill-others), the overall run will also succeed.
  • ❌ If the tests fail, jest will exit with a failure code, which ultimately causes the whole test command to fail.

As a further step, we could also use Travis and Wrangler to deploy our code on success, and send a notification on failure. We will keep this for a future iteration.
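As a rough, untested sketch, Travis's script deployment provider could invoke wrangler publish only for builds on the main branch (the branch name and conditions here are assumptions to adapt):

deploy:
  provider: script
  script: wrangler publish
  on:
    branch: master
.travis.yml (deploy sketch)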

Final results

From my perspective, this method has worked every time I have tried it locally. The coordination kicks in, and the tests run only once wrangler dev has done its thing and is ready to accept test requests:

Output of npm run test in my sample project

If I force some of the tests to fail, I also obtain the expected outcome. Below, I am absolutely adamant that 6 * 5 = 300, and Jest faithfully obliges my wishes 😀

Output of npm run test when a test is forced to fail

This approach also works fine in Travis, meaning I can fully automate the testing of my serverless Cloudflare Worker.

What do you think? Have you tried to achieve this in another way? Let me know in the comments!