
I want to write unit tests with NUnit that hit the database. I'd like to have the database in a consistent state for each test. I thought transactions would allow me to "undo" each test, so I searched around and found several articles from 2004-05 on the topic.

These seem to revolve around implementing a custom attribute for NUnit which builds in the ability to roll back DB operations after each test executes.

That's great but...

  1. Does this functionality exist somewhere in NUnit natively?
  2. Has this technique been improved upon in the last 4 years?
  3. Is this still the best way to test database-related code?

Edit: it's not that I want to test my DAL specifically, it's more that I want to test pieces of my code that interact with the database. For these tests to be "no-touch" and repeatable, it'd be awesome if I could reset the database after each one.

Further, I want to ease this into an existing project that has no testing in place at the moment. For that reason, I can't practically script up a database and data from scratch for each test.

Michael Haren

6 Answers


NUnit now has a [Rollback] attribute, but I prefer to do it a different way. I use the TransactionScope class. There are a couple of ways to use it.

[Test]
public void YourTest()
{
    // TransactionScope lives in the System.Transactions namespace and assembly.
    using (TransactionScope scope = new TransactionScope())
    {
        // your test code here
    }
}

Since you never tell the TransactionScope to commit (by calling its Complete() method), it rolls back automatically when it is disposed. This works even if an assertion fails or some other exception is thrown.
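
For contrast, here is a minimal sketch of the commit path: calling Complete() before the scope is disposed is the signal to commit, so leaving it out, as the test above does, guarantees a rollback.

using System.Transactions;

public void SaveForReal()
{
    using (TransactionScope scope = new TransactionScope())
    {
        // ... database work on any connection opened inside the scope ...

        // Signal success; Dispose will now commit instead of rolling back.
        // Omit this call (as the test above does) and everything is undone.
        scope.Complete();
    }
}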

The other way is to use [SetUp] to create the TransactionScope and [TearDown] to call Dispose on it. This cuts out some code duplication but accomplishes the same thing.

[TestFixture]
public class YourFixture
{
    private TransactionScope scope;

    [SetUp]
    public void SetUp()
    {
        // Open an ambient transaction before each test runs.
        scope = new TransactionScope();
    }

    [TearDown]
    public void TearDown()
    {
        // Dispose without Complete(): everything the test did is rolled back.
        scope.Dispose();
    }


    [Test]
    public void YourTest() 
    {
        // your test code here
    }
}

This is as safe as the using statement in an individual test because NUnit will guarantee that TearDown is called.

Having said all that, I do think that tests that hit the database are not really unit tests. I still write them, but I think of them as integration tests. I still see them as providing value. One place I use them often is in testing LINQ to SQL code. I don't use the designer. I hand-write the DTOs and attributes. I've been known to get it wrong. The integration tests help catch my mistakes.
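
To illustrate the kind of hand-written mapping this refers to, here is a minimal sketch (the table, columns, and class are invented for the example):

using System.Data.Linq.Mapping;

// Hypothetical hand-written LINQ to SQL mapping. Every name and attribute
// here is typed by hand, so a typo only surfaces when a query actually hits
// the database. That is exactly what these integration tests catch.
[Table(Name = "dbo.Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true, IsDbGenerated = true)]
    public int Id { get; set; }

    [Column(CanBeNull = false)]
    public string Name { get; set; }
}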

Mike Two
  • After using this approach for a couple weeks, I'm very happy with it, thanks again! – Michael Haren Dec 30 '08 at 20:36
  • I ended up using a very similar pattern, but with a base class that deals with the database trivia, including setting up the connections and whatnot. – Bruno Lopes Jan 02 '09 at 23:46
  • The only problem is surely if you don't commit, you cannot then test the data has been committed to the database? i.e. I'd like my test to call code that calls the DB, then do some asserts on the DB to verify the data, but finally rollback all those changes when the test or test suite is complete. Though it's a valid point to say these aren't really unit tests. Personally I mock the DAL generally, but it's useful to have explicit DB tests that aren't run on an automated run. – tjmoore Jun 02 '10 at 17:15
  • @tjmoore - as long as you query the data using the same connection you can see the rows since you are "inside" the transaction. In the sample code in the answer above you would call something that did some inserts perhaps and then query the data back and check that it contains the expected values. Once the test completes and the `TransactionScope` rolls back nothing will be left. Since your query is enlisted in the same transaction as the insert the rows will be found (see the sketch after these comments). Generally I also mock the DAL, but as you said, sometimes you just have to prove the insert is going to work. – Mike Two Jun 02 '10 at 19:37
  • Of course, what I've realised is the code I have under test uses a DAL that maintains the DB connections (actually via the Enterprise Library) and without a re-write just for the tests, the connection is opened and closed on each DAL operation. Ah well, another approach then. – tjmoore Jun 07 '10 at 16:30
  • @tjmoore - It should still work. That's the thing about `TransactionScope`. When a new connection is created it will automatically enlist in the current `TransactionScope`. Since the transaction scope will span multiple connections you will need to enable the MSDTC (Microsoft Distributed Transaction Coordinator). But it will work. I've used this code in that situation. – Mike Two Jun 08 '10 at 01:11
  • @MikeTwo I have implemented and adopted this pattern, but I am having an issue where the SQL tables I'm interacting with have an identity column, and this column is being incremented even though the TransactionScope is not being committed. Any suggestions? – propagated Apr 25 '14 at 03:33
  • @propagated That's kind of expected http://stackoverflow.com/questions/282451/ There might be ways around it, but I'm not the best database resource. I've never cared what my identity columns values were so this never bothered me. Also I'm only running tests against a local db or a test db so it wasn't an issue. Sorry I couldn't help. – Mike Two Apr 25 '14 at 03:47
  • @MikeTwo No problem, thanks for your response. Yes, I read that post and retooled how I was thinking about it in terms of my tests and realized I didn't care. All good now. – propagated Apr 25 '14 at 15:20
  • Thanks for pointing to the TransactionScope class! Works perfectly with Petapoco and Nunit! – Mivaweb Oct 01 '16 at 07:58
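
As a rough sketch of the insert-then-verify pattern Mike Two describes in the comments (the connection string, the Widgets table, and the InsertWidget method under test are all hypothetical):

using System.Data.SqlClient;
using System.Transactions;
using NUnit.Framework;

[Test]
public void InsertWidget_IsVisibleInsideTheScope()
{
    using (var scope = new TransactionScope())
    using (var connection = new SqlConnection(TestConnectionString)) // placeholder
    {
        connection.Open(); // opened inside the scope, so it enlists automatically

        InsertWidget(connection, "gadget"); // hypothetical code under test

        // The row is visible here because this query runs in the same transaction...
        using (var command = new SqlCommand(
            "SELECT COUNT(*) FROM Widgets WHERE Name = 'gadget'", connection))
        {
            Assert.AreEqual(1, (int)command.ExecuteScalar());
        }
    } // ...and it is gone once the scope is disposed without Complete().
}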

I just went to a .NET user group and the presenter said he used SQLite in test setup and teardown, using the in-memory option. He had to fudge the connection a little and explicitly destroy it, but that gave a clean DB every time.

http://houseofbilz.com/archive/2008/11/14/update-for-the-activerecord-quotmockquot-framework.aspx
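
The talk's exact setup isn't shown there, but here is a minimal sketch of the in-memory trick using the Microsoft.Data.Sqlite package (the presenter may have used a different SQLite binding; the mechanics are the same, since the in-memory database lives only as long as its connection):

using Microsoft.Data.Sqlite;
using NUnit.Framework;

[TestFixture]
public class InMemoryDbFixture
{
    private SqliteConnection connection;

    [SetUp]
    public void SetUp()
    {
        // The in-memory database exists only while this connection stays open.
        connection = new SqliteConnection("Data Source=:memory:");
        connection.Open();

        using (var create = connection.CreateCommand())
        {
            create.CommandText =
                "CREATE TABLE Widgets (Id INTEGER PRIMARY KEY, Name TEXT)";
            create.ExecuteNonQuery();
        }
    }

    [TearDown]
    public void TearDown()
    {
        // Explicitly destroying the connection drops the in-memory database,
        // so every test starts from a clean schema.
        connection.Dispose();
    }
}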

nportelli

I would call these integration tests, but no matter. What I have done for such tests is have my setup methods in the test class clear all the tables of interest before each test. I generally hand write the SQL to do this so that I'm not using the classes under test.
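
A sketch of that setup-time cleanup (the table names and connection string are placeholders, and the DELETE order assumes children before parents where foreign keys exist):

using System.Data.SqlClient;
using NUnit.Framework;

[SetUp]
public void ClearTablesOfInterest()
{
    // Hand-written SQL, deliberately bypassing the classes under test.
    using (var connection = new SqlConnection(TestConnectionString)) // placeholder
    {
        connection.Open();
        using (var command = new SqlCommand(
            "DELETE FROM OrderLines; DELETE FROM Orders; DELETE FROM Customers;",
            connection))
        {
            command.ExecuteNonQuery();
        }
    }
}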

Generally, I rely on an ORM for my data layer and thus I don't write unit tests for much there; I don't feel a need to unit test code that I didn't write. For code that I add in the layer, I generally use dependency injection to abstract out the actual connection to the database so that when I test my code, it doesn't touch the actual database. Do this in conjunction with a mocking framework for best results.
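
As a sketch of that injection seam (the interface and classes are invented, and the mock uses the Moq library):

using Moq;
using NUnit.Framework;

// Hypothetical abstraction over the database.
public interface ICustomerRepository
{
    string GetName(int id);
}

// Hypothetical class under test; it never sees a real connection.
public class GreetingService
{
    private readonly ICustomerRepository repository;

    public GreetingService(ICustomerRepository repository)
    {
        this.repository = repository;
    }

    public string Greet(int id)
    {
        return "Hello, " + repository.GetName(id);
    }
}

[TestFixture]
public class GreetingServiceTests
{
    [Test]
    public void Greet_UsesTheRepository()
    {
        var repository = new Mock<ICustomerRepository>();
        repository.Setup(r => r.GetName(42)).Returns("Ada");

        var service = new GreetingService(repository.Object);

        Assert.AreEqual("Hello, Ada", service.Greet(42));
    }
}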

tvanfosson
  • Unfortunately this approach is not practical for my projects (hundreds of tables, procedures, gigs of data). This is too high-friction to justify on an existing project. – Michael Haren Nov 26 '08 at 16:11
  • But your unit tests should be broken up in to smaller, more focused classes that don't touch all of the tables. You only need to deal with the tables this particular test class touches. – tvanfosson Nov 26 '08 at 16:13
  • Also, retrofitting unit tests on existing projects is probably best done on an "as needed" basis -- like when you need to refactor or fix a bug. Then you can write a "box" of tests around the existing code to guarantee that your changes don't break things (or fix the bug). – tvanfosson Nov 26 '08 at 16:15
  • I wish that were true. I really do. Plus, I don't want to have to write lots and lots of fixture code just to get the db into a "ready to go" state. – Michael Haren Nov 26 '08 at 16:15

Consider creating a database script so that you can run it automatically from NUnit as well as manually for other types of testing. For example, if using Oracle then kick off SqlPlus from within NUnit and run the scripts. These scripts are usually faster to write and easier to read. Also, very importantly, running SQL from Toad or equivalent is more illuminating than running SQL from code or going through an ORM from code. Generally I'll create both a setup and teardown script and put them in setup and teardown methods.
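
For instance, kicking off SqlPlus from a setup method might look something like this (the credentials and script name are placeholders, and the script is assumed to exit with a non-zero code on failure, e.g. via WHENEVER SQLERROR EXIT):

using System.Diagnostics;
using NUnit.Framework;

[SetUp]
public void RunSetupScript()
{
    // Run the same script a developer would run by hand from SqlPlus or Toad.
    var sqlplus = Process.Start(new ProcessStartInfo
    {
        FileName = "sqlplus",
        Arguments = "user/password@testdb @setup_test_data.sql", // placeholders
        UseShellExecute = false
    });
    sqlplus.WaitForExit();
    Assert.AreEqual(0, sqlplus.ExitCode, "database setup script failed");
}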

Whether you should be going through the DB at all from unit tests is another discussion. I believe it often does make sense to do so. For many apps the database is the absolute center of action, the logic is highly set based, and all the other technologies and languages and techniques are passing ghosts. And with the rise of functional languages we are starting to realize that SQL, like JavaScript, is actually a great language that was right there under our noses all these years.

Just as an aside, Linq to SQL (which I like in concept though have never used) almost seems to me like a way to do raw SQL from within code without admitting what we are doing. Some people like SQL and know they like it, others like it and don't know they like it. :)

Mike

For this sort of testing, I experimented with NDbUnit (working in concert with NUnit). If memory serves, it was a port of DbUnit from the Java platform. It had a lot of slick commands for just the sort of thing you're trying to do. The project appears to have moved here:

http://code.google.com/p/ndbunit/

(it used to be at http://ndbunit.org).

The source appears to be available via this link: http://ndbunit.googlecode.com/svn/trunk/

Scott Lawrence

For anyone coming to this thread these days like me, I'd like to recommend trying the Reseed library I'm currently developing for this specific case.

Neither an in-memory db replacement (lacks features) nor transaction rollback (transactions can't be nested) was a suitable option for me, so I settled on a simple delete/insert cycle to restore the data. I ended up writing a library to generate those scripts, trying to optimize test speed and simplicity of setup. I'd be happy if it helps anyone else.

Another alternative I'd recommend is using database snapshots to restore data, which offers comparable performance and usability (a SQL Server sketch follows the list). The workflow is as follows:

  • delete existing snapshots;
  • create db;
  • insert data;
  • create snapshot;
  • execute test;
  • restore from snapshot;
  • go to "execute test" until none left;
  • drop snapshot.
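
On SQL Server, the two snapshot steps might look like this (the database name, snapshot name, logical file name, path, and connection string are all made up; the remaining steps are ordinary NUnit plumbing):

using System.Data.SqlClient;

static class SnapshotSteps
{
    // Step "create snapshot": run once after the test data has been inserted.
    // NAME must match the logical data file name of the source database.
    public const string CreateSnapshot = @"
        CREATE DATABASE TestDb_Snap
        ON (NAME = TestDb, FILENAME = 'C:\Snapshots\TestDb_Snap.ss')
        AS SNAPSHOT OF TestDb;";

    // Step "restore from snapshot": run after each test.
    public const string RestoreSnapshot = @"
        ALTER DATABASE TestDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        RESTORE DATABASE TestDb FROM DATABASE_SNAPSHOT = 'TestDb_Snap';
        ALTER DATABASE TestDb SET MULTI_USER;";

    public static void Execute(string sql)
    {
        // Connect to master rather than the database being restored.
        using (var connection = new SqlConnection(
            "Server=.;Database=master;Integrated Security=true")) // placeholder
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}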

It's suitable if you can use a single data script for all the tests; it allows you to execute the insertion (which is supposed to be the slowest part) only once, and you don't need a data cleanup script at all.

For a further performance improvement, as such tests could take a lot of time, consider using a pool of databases and running the tests in parallel.

Uladzislaŭ