Integration Testing NestJS APIs with a Test Database
Unit tests for APIs can provide little to no value, so the next logical step is often to write integration tests. You don't want to touch live data, though, and you do want tests to make assertions against actual (un-mocked) service logic. The following is an approach that does just that.
With all the mocking and stubbing that often comes with unit tests, writing them can feel like a waste of time. For example:
```typescript
describe('findAll', () => {
  it('should return an array of dogs', async () => {
    const result = ['Snoopy'];
    jest.spyOn(dogService, 'findAll').mockImplementation(async () => result);
    expect(await dogController.findAll()).toBe(result);
  });
});
```
Since the implementation of `findAll()` in `dogService` is mocked to just return `['Snoopy']`, all this test really does is assert that `dogController`'s `findAll()` function calls and returns the result of `dogService`'s `findAll()` function. There isn't much value in such a test as far as ensuring sound API logic.
Unit testing is for small individual pieces of code (units), such as functions, with heavy use of mocks and stubs. It's great for making sure isolated chunks of code work as expected on their own, but that's not quite what we're looking for here.
Integration testing covers interactions between components of an entire system (think APIs and databases) and thus makes a lot more sense here as it involves simulating calls to API endpoints and checking that table data is read/written as expected. Supertest is a great tool for this as it allows you to simulate API calls and make assertions against responses.
NestJS, which supports Supertest out of the box (and as you can see has a lot of cool cat pics throughout its documentation), provides a useful way to spin up a Nest application context for tests to run against, including the ability to make overrides to the things you include in that mock application.
This post lays out an approach that makes use of this feature to
- write integration tests for Nest APIs using Supertest and
- ensure that test logic runs against a test database as opposed to a default/live database.
This will not be an in-depth project setup walkthrough, but rather a happy-path overview of key pieces needed to achieve our goal. I'm mainly assuming familiarity with NestJS concepts, or better yet, that you have a working NestJS application that the approach can be applied to. If you have neither, that's fine! I hope it's still a valuable read.
Database and Data Source Setup
I used TypeORM as my ORM and PostgreSQL for my database, but the general approach applies to Nest testing regardless of the database technologies used.
The first step is to get the databases up and running. Here I use Docker Compose to launch Postgres instances for a default and test database:
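Here's a minimal sketch of what that Compose file might look like; the service names, credentials, and values are illustrative rather than the exact ones from my project:

```yaml
# docker-compose.yml — a minimal sketch; names and credentials are illustrative
version: '3.8'
services:
  default-db:
    image: postgres # latest postgres version by default
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: default_db
    ports:
      - '5432:5432' # POSTGRES_DEFAULT_PORT
  test-db:
    image: postgres
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: test_db
    ports:
      - '5433:5432' # POSTGRES_TEST_PORT
```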
With a simple `docker-compose up` command, Docker uses this file to spin up two containers (one for each database), each using the latest postgres version.

Notice the difference in ports: `POSTGRES_DEFAULT_PORT` is 5432, while `POSTGRES_TEST_PORT` is simply something else – I used 5433.
Now here is the data source provider where the database connection is made in my Nest code:
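A sketch of that provider follows, assuming a `getInitializedDataSource()` helper and a `DATA_SOURCE` injection token (both names are mine for illustration; adjust paths and options to your project):

```typescript
// data-source.provider.ts — a sketch; names and connection options are illustrative
import { DataSource } from 'typeorm';
import { Dog } from '../dog/dog.entity';

export const DATA_SOURCE = 'DATA_SOURCE';

// Creates and initializes a DataSource. When no arguments are passed,
// it falls back to the default (live) database and port.
export const getInitializedDataSource = async (
  database: string = process.env.POSTGRES_DEFAULT_DB ?? 'default_db',
  port: string | number = process.env.POSTGRES_DEFAULT_PORT ?? 5432,
): Promise<DataSource> => {
  const dataSource = new DataSource({
    type: 'postgres',
    host: process.env.POSTGRES_HOST ?? 'localhost',
    port: Number(port),
    username: process.env.POSTGRES_USER,
    password: process.env.POSTGRES_PASSWORD,
    database,
    entities: [Dog],
  });
  return dataSource.initialize();
};

export const dataSourceProvider = {
  provide: DATA_SOURCE,
  // Nest calls this with no arguments, so the defaults above kick in
  useFactory: getInitializedDataSource,
};
```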
The key piece here is how the `database` and `port` fields are set in the `DataSource` constructor – default values are used when no parameters are passed into the provider's factory function. This is how the live/production NestJS application connects to the default database. I'll make use of the parameters in a bit when I override this provider for tests.
For now, take a look at how the data source provider can be used to create repository providers via injection:
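Something along these lines, following the custom-provider pattern from the TypeORM recipe in the Nest docs (the `DOG_REPOSITORY` token is illustrative):

```typescript
// dog.providers.ts — a sketch following Nest's TypeORM custom-provider recipe
import { DataSource } from 'typeorm';
import { Dog } from './dog.entity';
import { DATA_SOURCE } from '../database/data-source.provider';

export const DOG_REPOSITORY = 'DOG_REPOSITORY';

export const dogProviders = [
  {
    provide: DOG_REPOSITORY,
    // The data source provider is injected here and used to create the repository
    useFactory: (dataSource: DataSource) => dataSource.getRepository(Dog),
    inject: [DATA_SOURCE],
  },
];
```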
Repository providers can then be used for database operations inside of API service logic, like so:
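For example, a service method like the hypothetical `findByName()` below, which backs the `/dogs/byName` endpoint used in the tests later on:

```typescript
// dog.service.ts — a sketch of service logic using the injected repository
import { Inject, Injectable } from '@nestjs/common';
import { Repository } from 'typeorm';
import { Dog } from './dog.entity';
import { DOG_REPOSITORY } from './dog.providers';

@Injectable()
export class DogService {
  constructor(
    @Inject(DOG_REPOSITORY)
    private readonly dogRepository: Repository<Dog>,
  ) {}

  // Reads from whichever database the injected data source is connected to
  async findByName(name: string): Promise<Dog[]> {
    return this.dogRepository.find({ where: { name } });
  }
}
```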
Populate Test Database
The databases are up and running and we just saw how APIs can operate on the database via repository provider injection…by way of data source provider injection (injection Inception?).
Now, the test database needs to be populated with actual tables since the Docker script from earlier only created a table-less database for us.
The approach I took on this was to write a simple script that connects to the test database and runs synchronize() on it, which updates the database with the entities (tables) passed into the data source initialization:
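A sketch of such a script, reusing the `getInitializedDataSource()` helper from earlier:

```typescript
// init-test-db.ts — a sketch of the test database initialization script
import { getInitializedDataSource } from './database/data-source.provider';

const initTestDatabase = async (): Promise<void> => {
  // Connect to the test database (not the default one) via parameters
  const dataSource = await getInitializedDataSource(
    process.env.POSTGRES_TEST_DB,
    process.env.POSTGRES_TEST_PORT,
  );

  // Create/update tables to match the entity classes, then disconnect
  await dataSource.synchronize();
  await dataSource.destroy();
};

initTestDatabase();
```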
Notice the parameters being passed into the data source initialization function that we saw earlier. This is how we connect to the test database instead of the default one. As we’ll see in a bit, I make the exact same function call to override the data source provider on the mock Nest context.
Synchronizing creates (the very first time the script is run) or updates (subsequent script runs) the tables for entities according to how they are defined in their respective TypeORM entity classes. Let’s take a look back at the data source function from earlier:
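Recalling the sketch from earlier, the relevant piece is the `entities` array passed to the `DataSource` constructor:

```typescript
// From the data source sketch above — the entities array is what
// synchronize() uses to decide which tables to create/update
const dataSource = new DataSource({
  // ...connection options...
  entities: [Dog],
});
```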
So in this example, the script creates the `dog` table when it's run the first time and updates it any time there are changes to its entity class thereafter.

The script can be run via an npm command (e.g. `npm run test:init`).

The `dog` table is now created and ready for some test runs!
Run Tests
Here is where the rubber meets the road. For my service’s test file, I initialize the aforementioned Nest application context with the data source provider overridden as described in this documentation:
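Here's a sketch of the full spec file, assuming the `HelperService`, `DATA_SOURCE` token, and `getInitializedDataSource()` helper introduced above (module names and import paths are illustrative):

```typescript
// dog.service.spec.ts — a sketch; module names and import paths are illustrative
import { INestApplication } from '@nestjs/common';
import { Test, TestingModule } from '@nestjs/testing';
import { DataSource } from 'typeorm';
import * as request from 'supertest';
import { DogModule } from '../dog/dog.module';
import { HelperService } from './helper.service';
import { DATA_SOURCE, getInitializedDataSource } from '../database/data-source.provider';

describe('DogService (integration)', () => {
  const mockDogName = 'Snoopy';
  let mocks: { app: INestApplication; appServer: any; helperService: HelperService };

  beforeAll(async function () {
    // Create the testing module with the data source provider overridden
    // to point at the test database
    const mockModule: TestingModule = await Test.createTestingModule({
      imports: [DogModule],
      providers: [HelperService],
    })
      .overrideProvider(DATA_SOURCE)
      .useFactory({
        factory: async (): Promise<DataSource> => {
          return getInitializedDataSource(
            process.env.POSTGRES_TEST_DB,
            process.env.POSTGRES_TEST_PORT,
          );
        },
      })
      .compile();

    // Seed the test database with a dog to find
    const helperService: HelperService = await mockModule.resolve(HelperService);
    await helperService.insertMockDog(mockDogName);

    // Spin up the application context and grab its HTTP server for Supertest
    const mockApp: INestApplication = mockModule.createNestApplication();
    await mockApp.init();
    mocks = { app: mockApp, appServer: mockApp.getHttpServer(), helperService };
  });

  it('should successfully find dog given name', async function () {
    const response = await request(mocks.appServer).get(`/dogs/byName?name=${mockDogName}`);
    const results = response.body;
    expect(response.status).toBe(200);
    expect(results.length).toBe(1);
    expect(results[0].name).toBe(mockDogName);
  });

  afterAll(async function () {
    await mocks.app.close();
    await mocks.helperService.disconnectFromDatabase();
  });
});
```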
Note: Nest is agnostic to testing frameworks but supports Jest out of the box, which I’m using here.
Now let's break this down.
First, in the `beforeAll()` block, a `TestingModule` (which is akin to a Nest application's root `AppModule`) is created. In its initialization you can see where I override the specified provider, in this case `DATA_SOURCE`, with what's returned from the `useFactory()` function: a new test-specific `DataSource`:
```typescript
...
.overrideProvider(DATA_SOURCE)
.useFactory({
  factory: async (): Promise<DataSource> => {
    return getInitializedDataSource(
      process.env.POSTGRES_TEST_DB,
      process.env.POSTGRES_TEST_PORT,
    );
  },
})
...
```
Now all API table operations during test runs are done against the test database. Hip-hip hooray!
This mocked module is then used to resolve a helper service (`insertMockDog()` on the next line saves a puppy named 'Snoopy' to the test database's `dog` table), create the application context, and set up a mock server for Supertest calls:
```typescript
...
const helperService: HelperService = await mockModule.resolve(HelperService);
await helperService.insertMockDog(mockDogName);
const mockApp: INestApplication = mockModule.createNestApplication();
await mockApp.init();
const mockAppServer: any = mockApp.getHttpServer();
...
```
Then, in the lone test, the dog endpoint is hit via Supertest and assertions are made to check whether it finds 'Snoopy' in the table:
```typescript
...
it('should successfully find dog given name', async function () {
  const response = await request(mocks.appServer).get(`/dogs/byName?name=${mockDogName}`);
  const results = response.body;
  expect(response.status).toBe(200);
  expect(results.length).toBe(1);
  expect(results[0].name).toBe(mockDogName);
});
...
```
Finally, in the `afterAll()` block, the mock app context is closed and the helper service is used to close the data source. This teardown is required to prevent open database connections from leaking and keeping Jest from exiting cleanly:
```typescript
...
afterAll(async function () {
  await mocks.app.close();
  await mocks.helperService.disconnectFromDatabase();
});
...
```
Here's the implementation of `disconnectFromDatabase()`:
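In sketch form, assuming the data source is injected via the same `DATA_SOURCE` token (TypeORM's `destroy()` is what actually closes the connection):

```typescript
// helper.service.ts — a sketch of the teardown helper
import { Inject, Injectable } from '@nestjs/common';
import { DataSource } from 'typeorm';
import { DATA_SOURCE } from '../database/data-source.provider';

@Injectable()
export class HelperService {
  constructor(@Inject(DATA_SOURCE) private readonly dataSource: DataSource) {}

  async disconnectFromDatabase(): Promise<void> {
    // destroy() closes the underlying database connection
    await this.dataSource.destroy();
  }
}
```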
Sweet! APIs can now be hit freely with their logic un-mocked and hitting the (test) database, just like in real life.
You can see how the helper service can be used for all of your setup and teardown needs, as well as how the logic in `beforeAll()` could be functioned out for reuse in other `.spec` files, like so:
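Here's one shape that shared setup function could take (`setUpTestingModule` and `TestMocks` are illustrative names I've made up):

```typescript
// test-setup.ts — a sketch of reusable integration test setup
import { INestApplication, ModuleMetadata } from '@nestjs/common';
import { Test } from '@nestjs/testing';
import { DataSource } from 'typeorm';
import { HelperService } from './helper.service';
import { DATA_SOURCE, getInitializedDataSource } from '../database/data-source.provider';

export interface TestMocks {
  app: INestApplication;
  appServer: any;
  helperService: HelperService;
}

// Builds a testing module from the caller's metadata, overrides the data
// source to point at the test database, and returns the mocks a spec needs
export const setUpTestingModule = async (metadata: ModuleMetadata): Promise<TestMocks> => {
  const mockModule = await Test.createTestingModule(metadata)
    .overrideProvider(DATA_SOURCE)
    .useFactory({
      factory: async (): Promise<DataSource> =>
        getInitializedDataSource(process.env.POSTGRES_TEST_DB, process.env.POSTGRES_TEST_PORT),
    })
    .compile();

  const helperService = await mockModule.resolve(HelperService);
  const app = mockModule.createNestApplication();
  await app.init();

  return { app, appServer: app.getHttpServer(), helperService };
};
```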
With this setup, any service test file simply passes its module into the function, adds its provider to the `providers` list, and gets back its own object with the mocked context it needs to run great integration tests.
Moving Up Out of Local
Let’s now take a look at how this approach can apply to test runs in the CI/CD pipeline.
For this application, tests are automatically run when any GitHub pull requests are opened to its main branch. This is done via GitHub Actions workflows. Read all about 'em!
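A sketch of what that workflow might look like (job layout, action versions, and env values are illustrative):

```yaml
# .github/workflows/tests.yml — a sketch; versions and values are illustrative
name: Integration Tests

on:
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      test-db:
        image: postgres
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: test_db
        ports:
          - 5433:5432
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
      - run: npm ci
      # Populate the test database with tables
      - run: npm run test:init
      # Run the actual tests
      - run: npm test
```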
As you can see in the workflow steps, a postgres database instance is spun up before tests are run, then our handy `test:init` script is run to populate the database with tables, and finally the actual tests run. Cool!
Conclusion
I hope this post helps others out there struggling with testing in NestJS like I was. The idea is to have API service logic actually tested (not mocked) using a dedicated test database and in a way that takes advantage of Nest’s great architecture. I believe this approach can help you accomplish that.
Thanks for reading!