
This question is extracted from a comment I posted here: What's the best strategy for unit-testing database-driven applications?

So I have a huge database schema for a legacy application (with quite an old code base) that has many tables, synonyms, triggers, and dblinks, and we have (finally) started to test some parts of the application.

Our tests already use mocks, but in order to test the queries we are running, we have decided to use an in-memory database with a short-lived test dataset.

But setting up the in-memory database requires a specific SQL script for the schema, because we cannot import the real production DDL directly.

To make things harder, the database contains functions and procedures that need to be implemented in Java (we use the H2 database, and that is how it declares procedures).
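As a rough sketch of how that mapping works: H2 lets you register any public static Java method as a stored function or procedure via `CREATE ALIAS`. The class and method names below are hypothetical placeholders, not the actual legacy procedures.

```java
// Hypothetical re-implementation of a legacy database function in Java,
// so an H2 test instance can expose it under the same name as production.
public class H2Functions {

    // Example function: normalize a code column (trim + uppercase).
    public static String normalizeCode(String code) {
        if (code == null) {
            return null;
        }
        return code.trim().toUpperCase();
    }

    // The schema setup script then declares the alias, e.g.:
    //   CREATE ALIAS NORMALIZE_CODE FOR "H2Functions.normalizeCode";
    // after which SELECT NORMALIZE_CODE(col) works in H2 queries.
}
```

Every such function is a second implementation that can silently drift from the real one, which is part of the maintenance burden described below.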

I'm afraid that our tests won't break the day the real database schema changes, and that we will only spot the problem at runtime, potentially in production.

I know that our tests sit somewhere on the border between integration and unit tests. However, with the current architecture it is quite hard to isolate the tests from the database, and we want proper tests for the database queries (there is no ORM involved).

What would be a solution for getting a DDL as close as possible to the real one, without having to maintain it manually?

Guillaume
  • Face it: As soon as you're integrating with an external system (DB) you're no longer on the unit test level any more. That being said, if you take a copy of the production DB, at least anonymize the user data in there: nothing's more annoying than sending test notification emails to real users - for both ends. But the real answer probably is: this is one of the problems that can be solved with just another level of indirection (not that I say it should). If you have an untestable system: introduce code review and refactor to a testable one. Or accept the untestability and keep code reviews going – Olaf Kock Aug 29 '14 at 12:55
  • You could also script the DB, rebuild it as DB_Test, fill it with fake data, and run your tests on there. A bit of effort up front, but it could be very useful if you plan to continue work on this software for a long time. – Jonathon Anderson Aug 29 '14 at 21:51

1 Answer


If your environments are Dockerized, I would highly suggest checking out Testcontainers (https://www.testcontainers.org/modules/databases/). We have used it to replace the in-memory databases in our tests with real database instances created from the production DDL scripts.

Additionally, you can use a tmpfs mount to get performance close to that of an in-memory database. This is nicely explained in the following post by Vlad Mihalcea: https://vladmihalcea.com/how-to-run-integration-tests-at-warp-speed-with-docker-and-tmpfs/.
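Combining the two might look something like the sketch below. It assumes the `org.testcontainers:postgresql` dependency, a running Docker daemon, and a Postgres production database; the image tag and script path are examples, not prescriptions.

```java
import java.util.Map;
import org.testcontainers.containers.PostgreSQLContainer;

public class DatabaseTestSupport {

    // One container for the test suite, created from the real
    // production DDL instead of a hand-maintained H2 translation.
    static final PostgreSQLContainer<?> DB =
            new PostgreSQLContainer<>("postgres:15-alpine")
                    // classpath script run once after the container starts;
                    // point it at an export of the production DDL
                    .withInitScript("schema/production-ddl.sql")
                    // keep the data directory on tmpfs (in RAM) for speed
                    .withTmpFs(Map.of("/var/lib/postgresql/data", "rw"));

    static {
        DB.start();
        // Hand DB.getJdbcUrl(), DB.getUsername() and DB.getPassword()
        // to the code under test (e.g. via a DataSource).
    }
}
```

Because the container runs the same database engine as production, the stored procedures and functions no longer need a parallel Java implementation, and a schema change breaks the tests instead of surfacing at runtime.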

This combination works great for our purposes (especially when combined with Hibernate's auto-DDL option), and I recommend that you check it out.

lmartinez