5

I am trying to do the following for UI test automation:

private TransactionScope _scope;

[SetUp]
public void TestSetUp()
{
    _scope = new TransactionScope();
}

[TearDown]
public void TearDown()
{
    _scope.Dispose();
}

[Test]
public void SomeTest()
{
    Utilities.SomeDeleteTransaction(companyCode);
}

I am trying to execute an Update query in [Test], do some stuff with the UI, and then roll back that transaction in [TearDown], which runs after the test. I am not sure I am doing it right, but I want to (probably) commit that transaction so I can see its effect on the UI and then roll back the same transaction so my DB stays the same. Can I accomplish that with TransactionScope or some other class?

Edit

This question is mostly about keeping the database in a known state for Selenium testing. Since my database is brought down from production monthly, I want to be able to execute some insert/update/delete SQL scripts to modify the DB before tests, do some UI testing with Selenium, and then roll back in TearDown (tests are written using NUnit) to make sure the DB does not have any influence on the tests and stays the same after the tests.

Saifur
  • If you want to return to *empty* (every table is empty) database after each test you may use Respawn library: https://github.com/jbogard/Respawn – csharpfolk Aug 05 '18 at 06:47

7 Answers

6

Database snapshots!

Save this script and name it "create_db_snapshot.sql"

/* Create a database snapshot */

USE master;
CREATE DATABASE Your_Database_Snapshot ON
( 
    NAME = Your_Database, 
    FILENAME = 'C:\Snapshots\Your_Database_Snapshot.ss' 
)
AS SNAPSHOT OF Your_Database;
GO

Also, save this script and name it "restore_db_from_snapshot.sql"

USE master;
RESTORE DATABASE Your_Database from 
DATABASE_SNAPSHOT = 'Your_Database_Snapshot';
GO

Example:

using System.IO;
using System.Data.SqlClient;
using Microsoft.SqlServer.Management.Smo;
using Microsoft.SqlServer.Management.Common;

[SetUp]
public void TestSetUp()
{
    string sqlConnectionString = @"server=test.database.com;uid=your_db_username;pwd=your_db_password;database=Your_Database;";

    // Adjust the path to wherever the script lives; SMO's ExecuteNonQuery handles the GO batch separators.
    string script = File.ReadAllText(@"create_db_snapshot.sql");

    using (SqlConnection conn = new SqlConnection(sqlConnectionString))
    {
        Server server = new Server(new ServerConnection(conn));
        server.ConnectionContext.ExecuteNonQuery(script);
    }
}

[TearDown]
public void TearDown()
{
    string sqlConnectionString = @"server=test.database.com;uid=your_db_username;pwd=your_db_password;database=Your_Database;";

    string script = File.ReadAllText(@"restore_db_from_snapshot.sql");

    using (SqlConnection conn = new SqlConnection(sqlConnectionString))
    {
        Server server = new Server(new ServerConnection(conn));
        server.ConnectionContext.ExecuteNonQuery(script);
    }
}

Snapshot documentation: https://msdn.microsoft.com/en-us/library/ms175158.aspx

Code credit for executing .sql file: https://stackoverflow.com/a/1728859/3038677

You might also need to run this script prior to executing restore_db_from_snapshot.sql...

/* Kill all current connections to Your_Database */

USE master;
DECLARE @kill varchar(8000) = '';
SELECT @kill = @kill + 'kill ' + CONVERT(varchar(5), spid) + ';'
FROM master..sysprocesses
WHERE dbid = db_id('Your_Database');
EXEC(@kill);
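
As a side note, instead of maintaining a separate kill script you can usually force the database into single-user mode around the restore. The following is only a rough sketch (the helper method, connection string, and database/snapshot names are placeholders, not part of this answer):

using System.Data.SqlClient;
using NUnit.Framework;

public class SnapshotRestoreTests
{
    // Placeholder connection string pointing at master, so this connection never holds Your_Database open.
    private const string MasterConnectionString =
        @"server=test.database.com;uid=your_db_username;pwd=your_db_password;database=master;";

    private static void RunOnMaster(string sql)
    {
        using (var conn = new SqlConnection(MasterConnectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    [TearDown]
    public void TearDown()
    {
        // SINGLE_USER ... ROLLBACK IMMEDIATE drops other connections, so no kill script is needed.
        RunOnMaster("ALTER DATABASE Your_Database SET SINGLE_USER WITH ROLLBACK IMMEDIATE;");
        RunOnMaster("RESTORE DATABASE Your_Database FROM DATABASE_SNAPSHOT = 'Your_Database_Snapshot';");
        RunOnMaster("ALTER DATABASE Your_Database SET MULTI_USER;");
    }
}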
mycargus
  • 3
    Simply put the database into single user mode before restoring. That way you don't need to kill anything. – usr Apr 24 '15 at 16:31
  • From what I understand, this will prevent tests from running concurrently. That is why I prefer the approach of `start tx -> test -> flush/save changes -> rollback tx`. – csharpfolk Aug 05 '18 at 06:48
2

I use database snapshots for Selenium testing of my web application. In the Setup method I roll the database back to a snapshot taken just after restoring the database from production. This guarantees the database is in the same state for each test run, but still allows you to test the UI. I have created several temporary stored procedures, for example #usp_restore_snapshot, to avoid littering the database and the unit tests with SQL code used only for testing.
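
A rough sketch of how this can be wired up so every fixture gets the rollback for free (the connection string, database and snapshot names are placeholders, and the restore is inlined here rather than wrapped in the temporary stored procedures mentioned above):

using System.Data.SqlClient;
using NUnit.Framework;

public abstract class SnapshotRestoreFixture
{
    // Connect to master so no connection is holding the target database open during the restore.
    private const string MasterConnectionString =
        @"server=test.database.com;uid=your_db_username;pwd=your_db_password;database=master;";

    [SetUp]
    public void RestoreSnapshot()
    {
        using (var conn = new SqlConnection(MasterConnectionString))
        using (var cmd = new SqlCommand(
            "RESTORE DATABASE Your_Database FROM DATABASE_SNAPSHOT = 'Your_Database_Snapshot';", conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}

[TestFixture]
public class SomeUiTests : SnapshotRestoreFixture
{
    [Test]
    public void SomeTest()
    {
        // Selenium steps run against the freshly restored database.
    }
}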

Alex
  • Thanks for your answer. How exactly would you do this for the full test suite? I have more than 50 test classes, each with its own [TestFixture] setup – Saifur Apr 08 '15 at 23:22
  • You could define a base class with a [SetUp] method that rolls back to the snapshot. Your test classes can then inherit from this base class. This works in NUnit 2.5 and higher. The SetUp method in the base class will be executed before the SetUp of the test class itself. – Alex Apr 08 '15 at 23:35
1

One way is to use TransactionScope (https://msdn.microsoft.com/en-us/library/system.transactions.transactionscope.aspx): it will roll everything back automatically, even on test failure, because Complete() is never called on the scope.

like:

[Test]
public void SomeTest()
{
    using (TransactionScope scope = new TransactionScope())
    {
        // here comes your test

        // scope.Complete() is never called, so the transaction
        // rolls back when the using block disposes the scope
    }
}

A "better" and "safer" way is the way you told in your question doing it via TearDown:

[TestFixture]
public class YourFixture
{
    private TransactionScope scope;

    [SetUp]
    public void TestSetUp()
    {
        scope = new TransactionScope();
    }

    [TearDown]
    public void TearDown()
    {
        scope.Dispose();
    }


    [Test]
    public void SomeTest()
    {
        // here comes your test 
    }
}

Why? Because NUnit guarantees that TearDown will be called.

Stan E
  • Thanks for your answer! I tried using `TransactionScope`. But the issue is: I instantiate the TransactionScope, then execute a delete statement (without committing), then try to run some UI tests. The query does not affect the UI since it was not committed. I don't have a problem with committing the transaction, as long as I can roll back its consequences. – Saifur Apr 06 '15 at 17:42
  • Ah, then the answer is very simple: implement some API for cleaning the "user" or environment and call it in TearDown; I guess that will be the simplest solution. In my previous project we had a big pool of users and cleaned each user after every test. The thing is, it's hard to determine which transactions were called while testing the UI, so TransactionScope or similar things will definitely not work. – Stan E Apr 06 '15 at 17:48
  • Thanks again. It probably is. But I did not want to introduce another API layer just for Selenium testing, which is another layer of maintenance. That's the only reason. – Saifur Apr 06 '15 at 17:50
  • 1
    Cleaning the test environment is usually the biggest architectural problem in UI testing. That's why you should decide up front how you will do it. The cleanest way is implementing an API that uses internal functions :) But everybody decides based on their own app and opportunities. – Stan E Apr 06 '15 at 18:00
  • 1
    One thing I like doing is copying and attaching/detaching the database file after a test, etc. Of course my preconfigured database files are quite small, even with data. – alas Apr 06 '15 at 19:24
  • @alas that's definitely a reason to create a clean environment before each test starts. It will work in a lot of circumstances, but in some cases you need to test UI regressions and UI logic, looking at what was there before the changes and what happens afterwards with already-existing data (in fact, the more complicated and complex your application is, the less productive this variant will be). But for small to mid-sized applications it could be a perfect solution :) – Stan E Apr 07 '15 at 08:34
1

If you can guarantee that you only have one thread through the database, TearDown could just restore the known good backup of that DB. I suppose that for very large databases, this could be cumbersome.
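
A rough sketch of that TearDown (the backup path, database name, and connection string are placeholders, and it assumes no other connections are holding the database open):

using System.Data.SqlClient;
using NUnit.Framework;

public abstract class KnownGoodBackupFixture
{
    // Placeholder connection string to master; point it at your own server.
    private const string MasterConnectionString =
        @"server=test.database.com;uid=your_db_username;pwd=your_db_password;database=master;";

    [TearDown]
    public void RestoreKnownGoodBackup()
    {
        const string restoreSql =
            "RESTORE DATABASE Your_Database FROM DISK = 'C:\\Backups\\Your_Database_known_good.bak' WITH REPLACE;";

        using (var conn = new SqlConnection(MasterConnectionString))
        using (var cmd = new SqlCommand(restoreSql, conn))
        {
            conn.Open();
            cmd.CommandTimeout = 0; // a full restore of a large database can easily exceed the default 30 seconds
            cmd.ExecuteNonQuery();
        }
    }
}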

Chris Steele
  • How fast is that restore? My database is usually more than 100 GB – Saifur Apr 08 '15 at 23:23
  • Ours is maybe 2 gigs total... takes seconds. Just make sure both locations are on the same disk and use SSD if possible. Alternatively, you can try putting the folders on a RAM disk. I don't think the copy even at 100 gigs will be too bad. – Brett Green Apr 09 '15 at 17:42
1

We've done it one of two ways:

  1. Detach/Reattach the original database on teardown... that sounds really expensive, but if there's a lot to roll back it can be cheaper. It does require some plumbing to keep a copy of the original db files, copy them around, etc.
  2. Start a DTC transaction in Setup and dispose of it in TearDown... this works OK if the test setup is relatively light.

To make #1 even faster, we experimented with RAM disks to make the disk copies lightning fast. This can make a huge difference if you go that route. The DTC transaction scope is the most natural way to do this, but if you're creating thousands of records in setup and then rolling back, your tests can get pretty slow, and that's not a good thing.

Some example code for attach/detach:

public void Reset()
{
    if (!this.initialized || !this.connectionString.Contains("(local)"))
        return;

    TestDbManager.CopyNewFiles(this.remoteDatabaseSourceFolder, this.localDatabaseFilesCacheFolder);
    this.Detach(this.database);
    TestDbManager.CopyNewFiles(this.localDatabaseFilesCacheFolder, this.localSqlServerWorkingFolder);
    this.ReAttach(this.database, this.localSqlServerWorkingFolder);
}

So you have to keep track of folders for (a) the original db files and (b) the live db files. After each test you detach, copy from (a) to (b), then attach.

Attach/Detach is done with simple commands...

exec sp_attach_db @dbname = '{0}'
exec sp_detach_db @dbname = '{0}'

We've got a little helper class around all this to encapsulate it, since we work with multiple databases.
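
The Detach and ReAttach calls in the Reset() snippet are part of that helper; a rough sketch of what they might look like (the connection string, file-naming convention, and class shape are assumptions, not the original helper):

using System.Data.SqlClient;
using System.IO;

public class TestDbManager
{
    // Placeholder: integrated security against the local instance, connected to master.
    private const string MasterConnectionString =
        @"server=(local);Integrated Security=true;database=master;";

    private static void ExecuteOnMaster(string sql)
    {
        using (var conn = new SqlConnection(MasterConnectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }

    public void Detach(string database)
    {
        // Drop existing connections, then detach so the .mdf/.ldf files can be overwritten.
        ExecuteOnMaster(string.Format(
            "ALTER DATABASE [{0}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE; EXEC sp_detach_db @dbname = '{0}';",
            database));
    }

    public void ReAttach(string database, string workingFolder)
    {
        // Assumes the copied files keep the conventional <database>.mdf / <database>_log.ldf names.
        string mdf = Path.Combine(workingFolder, database + ".mdf");
        string ldf = Path.Combine(workingFolder, database + "_log.ldf");
        ExecuteOnMaster(string.Format(
            "EXEC sp_attach_db @dbname = '{0}', @filename1 = '{1}', @filename2 = '{2}';",
            database, mdf, ldf));
    }
}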

Brett Green
  • 1
    For #1... you need a local copy of your MDFs/LDFs and a process that, on teardown, detaches the database(s), copies from that location to a 'live' location, and then reattaches them. We typically do this in a base test class that all tests derive from, so it's transparent... I can post some code. – Brett Green Apr 09 '15 at 15:57
  • 1
    Added some sample code in my answer above... should give you the general idea. Like I said, you'll need some infrastructure around that approach but it's not that complicated in the end. – Brett Green Apr 09 '15 at 16:05
  • No need to detach. Set the database offline, overwrite the files and take it online. – usr Apr 24 '15 at 16:32
1

Having very large or long-running transactions could hide or create bugs and could cause other unwanted side effects.

As said before: SNAPSHOT (which I voted up :)

Create a SNAPSHOT at the START, REVERT back to it at the END

OR... you could just create a snapshot when the DB changes, e.g. when YOU need to, and then just keep reverting. A snapshot is like a VERY lightweight backup in this case.

CREATE DATABASE database_snapshot_name    
    ON     
    (    
        NAME = logical_file_name,    
        FILENAME = 'os_file_name'     
    ) [ ,...n ]     
    AS SNAPSHOT OF source_database_name

To revert back to the SNAPSHOT:

USE master;
-- Reverting AdventureWorks to AdventureWorks_dbss1800
RESTORE DATABASE AdventureWorks from 
DATABASE_SNAPSHOT = 'AdventureWorks_dbss1800';
GO

Both from MSDN:

https://msdn.microsoft.com/en-us/library/ms189281.aspx

https://msdn.microsoft.com/en-us/library/ms175876.aspx

Steve Drake
-2

I don't know whether you are stuck with C# code or SQL code. There is a way to do it in SQL: begin a transaction, update the data, read the uncommitted data, and then roll back.

Begin Tran

Update table1
set col1 = val2
where col1 = val1

-- read the uncommitted change back before rolling back
select col1 from table1 WITH (NOLOCK);

rollback
  • I am using *NUnit* and wanted to split this concept into two parts and do the rollback in *[TearDown]* – Saifur Apr 03 '15 at 19:21