I'm new to the serverless world on Amazon AWS, and I'm trying to understand the rules of thumb for writing integration tests.

Let's take the following simple system we want to test:

We have a supermarket for which we want to save all the receipts. The system consists of a DB with a single table for the data, a job queue containing all the transactions that need to be handled, and a couple of workers that do some pre-processing on each job in the queue and then commit the result to the DB.

We want to test the following scenario:

  1. Initialize new DB and queue
  2. Initialize couple of workers
  3. Insert jobs to the queue
  4. Wait
  5. Validate the result
  6. Clean the environment

Cleaning the environment ensures that the next test run produces the same result. If we don't clean up and, for example, data is left in the DB, it will affect validation the next time we run the tests.

My experience is with Docker. With it, I could docker-compose up a new DB and queues in the set-up and docker-compose down all the containers in the tear-down. This is a simple solution.
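For reference, a minimal docker-compose file for this kind of set-up might look like the following. Amazon's DynamoDB Local image is official; the SQS-compatible ElasticMQ image is a third-party stand-in, named here as an example.

```yaml
version: "3"
services:
  dynamodb:
    image: amazon/dynamodb-local   # Amazon's official local DynamoDB
    ports:
      - "8000:8000"
  sqs:
    image: softwaremill/elasticmq  # third-party SQS-compatible queue
    ports:
      - "9324:9324"
```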

Now, the question is: what's a good way to write such tests for my serverless AWS application? The DB is DynamoDB, the queues are SQS, and the "workers" are Lambdas triggered by SQS messages. Of course it's an "easy" example and quite overkill, but the question is broader than this specific case.

The solutions I found so far:

  1. Use local services, such as the local DynamoDB, which is official from Amazon, but also a local SQS, which is neither official nor supported by them. Cons: the results can differ from Amazon's services at deployment.
  2. Use sam deploy each time to run the tests on Amazon. I can create a new stack, but the resources remain on Amazon's servers forever (such as S3 buckets and queues). Cons: the code isn't local and is hard to debug, plus deployment takes time.
  3. Use the same resources for all the tests and clean them at the beginning. Cons: ugly code and the potential for a lot of code duplication between projects.

From what I've searched, there is no common "how to write tests" tutorial, and I would like to hear from your experience.

MyNick

1 Answer

I do a lot of testing with a similar setup and this is how I approach things. The goal is to find a balance between complete mocks and testing against a running server.

For DynamoDB and Elasticsearch I use the Docker images available. Before every test, in my Jest setup, I clear the DB tables and Elasticsearch indices; I just wrote some simple code to go through a list of tables and wipe them.
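A table-wiping helper along those lines could look like this. It is a sketch written against the scan/batchWrite shape of the AWS SDK for JavaScript v2 DocumentClient; the client is injected, and the table and key names are invented for illustration.

```javascript
// Sketch: wipe every table in a list by scanning its keys and
// batch-deleting them. Assumes a DocumentClient-style interface
// (scan/batchWrite returning objects with .promise()).
async function wipeTables(docClient, tables) {
  for (const { name, key } of tables) {
    // Project only the key attribute -- enough to issue deletes.
    const { Items } = await docClient.scan({
      TableName: name,
      ProjectionExpression: key,
    }).promise();
    // DynamoDB batchWrite accepts at most 25 requests per call.
    for (let i = 0; i < Items.length; i += 25) {
      const batch = Items.slice(i, i + 25).map((item) => ({
        DeleteRequest: { Key: { [key]: item[key] } },
      }));
      await docClient.batchWrite({
        RequestItems: { [name]: batch },
      }).promise();
    }
  }
}
```

Because the client is a parameter, the helper itself can be unit-tested with a stub before pointing it at the local DynamoDB container.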

In each test I seed the necessary tables, making sure the data will not conflict with other tests, since Jest runs in parallel. By that I mean: if I am going to have a test that fetches all entries by user, then I ensure that this user ID is used only in that test's seed, to prevent another test's seed from affecting the results.

I have wrappers for all the AWS services I use, such as Cognito, SNS, SQS, etc., which abstract AWS a bit for me. I unit test these using nock; in particular, I use the nock.recorder.rec() command, which lets me capture the response from AWS. I do these tests for all these root-level calls. This ensures my code works against live servers but is repeatable without a dependency on those servers.
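The wrapper idea might look like the following minimal sketch for SQS (class and method names are my own, not the answer's). Injecting the SDK client is what makes both the nock-recorded tests and plain stubbing possible.

```javascript
// Sketch: a thin wrapper around one AWS service (SQS) so the rest of
// the code never touches the SDK directly. Assumes an SDK-v2-style
// client whose sendMessage returns an object with .promise().
class QueueClient {
  constructor(sqs, queueUrl) {
    this.sqs = sqs;
    this.queueUrl = queueUrl;
  }

  async send(payload) {
    // SQS message bodies are strings, so serialize in one place.
    const res = await this.sqs.sendMessage({
      QueueUrl: this.queueUrl,
      MessageBody: JSON.stringify(payload),
    }).promise();
    return res.MessageId;
  }
}
```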

For all other unit tests (and by unit tests here I mean tests that just test a function and do not involve running sls offline, for example) I use Jest mocks and mock the AWS method. Since my low-level AWS wrappers were tested and recorded with nock, I can now just mock things for easy testing, as I am confident the mocked methods work as expected.

For integration/e2e testing with sls offline, things are trickier, since I cannot mock/nock when things are loaded in a running server. For this I provide each AWS service with a custom endpoint (every AWS service has a configuration option that accepts this parameter; defaulting to null, it uses the real AWS server, but you can create a local HTTP/Express server to mock the responses). Check out this SO question where I answered how to do this with Cognito; the process is the same for all other services.

It takes a bit of work to get this whole testing framework set up, but once done it works pretty flawlessly and also provides a test setup that CI servers can run.

I should also mention the amazing work on LocalStack, which provides local Docker servers for most AWS services. I personally prefer to run just what I need, hence my setup above, and it didn't provide a local abstraction for everything I needed, but it's a great option to explore, and it may fit your needs well. It's a big stack, however (my laptop freaks out if I try to run it).

cyberwombat