
I have an integration testing solution with my tests described in XML files. To capitalize on the Visual Studio 2010 testing infrastructure, I have a C# class in which every XML test file has an associated method that loads the XML file and executes its content. It looks like this:

[TestClass]
public class SampleTests
{

    [TestMethod]
    public void Test1()
    {
        XamlTestManager.ConductTest();
    }

    [TestMethod]
    public void Test2()
    {
        XamlTestManager.ConductTest();
    }

    ...

    [TestMethod]
    public void TestN()
    {
        XamlTestManager.ConductTest();
    }
}

Each method name corresponds to an XML file name. Hence, I have to have the following files in my test directory:

  • Test1.xml
  • Test2.xml
  • ...
  • TestN.xml

XamlTestManager.ConductTest() uses the StackTrace class to get the name of the calling method, and that name tells it which XML test file to load.
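A minimal sketch of that lookup (the real ConductTest implementation is not shown in the question; the class name, helper name, and directory argument here are hypothetical):

```csharp
using System;
using System.Diagnostics;
using System.IO;

public static class XamlTestManagerSketch
{
    // Hypothetical reconstruction: walk one frame up the call stack to get
    // the calling test method's name, then map it to an XML file of the
    // same name in the test directory. (Note: aggressive JIT inlining can
    // disturb stack frames in release builds.)
    public static string ResolveTestFile(string testDirectory)
    {
        var caller = new StackTrace().GetFrame(1).GetMethod();
        return Path.Combine(testDirectory, caller.Name + ".xml");
    }
}
```

Called from Test1(), ResolveTestFile(@"TestCases") would return a path ending in Test1.xml.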

I would like to get rid of the extra administration of adding/removing/renaming a test method every time I add/remove/rename an XML test file. How can I automagically generate this class or its methods during the compilation process, based on the actual XML files in my test directory?

Option 1: I have considered PostSharp, but it does not allow me to look up the XML files and generate methods on the fly (or was my look too superficial?).
Option 2: The other idea was to build a Visual Studio custom tool that generates my code whenever it is executed. The downside here is deployment: the custom tool needs to be registered with VS. I want a solution that can be committed to a repository, checked out on another computer, and used right away. (I believe in simplicity. "Check out and run" simplifies the life of new developers soooooo much if they do not need to go through a list of things to install before they can compile and run the application.)

Do you have any recommendation on how to get rid of this unnecessary maintenance burden?

EDIT:
At Justin's request, here are more details. We use BizUnit (fantastic!!!) as the basis of our framework, with a truckload of custom-made high-level test steps. From these steps we can build our tests like from Lego blocks, in a declarative manner. Our steps include FileDrop, web service invocation and even polling, firing up a full-blown web server to simulate a partner web application, a random data generator, data-comparing steps, etc. Here is an example test XML (in fact XAML):

<TestCase BizUnitVersion="4.0.154.0" Name="StackOverflowSample" xmlns="clr-namespace:BizUnit.Xaml;assembly=BizUnit" xmlns:nib="clr-namespace:MyCompany.IntegrationTest;assembly=BizUnit.MyCustomSteps" xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml">
  <TestCase.SetupSteps>
    <nib:ClearStep FailOnError="True" RunConcurrently="False" />
    <nib:LaunchSimulatedApp AppKernelCacheKey="provider" FailOnError="True" FireWakeUpCall="False" PortNumber="4000" RepresentedSystem="MyProviderService" RunConcurrently="False" />
    <nib:HttpGetStep FailOnError="True" RunConcurrently="False" Url="http://localhost:10000/Home/StartSvgPolling">
      <nib:HttpGetStep.Parameters>
        <x:String x:Key="PolledAddress">http://localhost:4000/SvgOutputPort.asmx</x:String>
        <x:String x:Key="PollingInterval">10</x:String>
        <x:String x:Key="FilterFile"></x:String>
      </nib:HttpGetStep.Parameters>
    </nib:HttpGetStep>
  </TestCase.SetupSteps>

  <TestCase.ExecutionSteps>
    <nib:DocumentMergeStep FailOnError="True" OutputCacheKey="inputDocument" RunConcurrently="False">
      <nib:DocumentMergeStep.InputDocuments>
        <nib:RandomLoader BoundingBox="Europe" LinkbackUrlPattern="http://MyProviderService/id={0}" MaxAmount="10" MaxID="100" MinAmount="10" MinID="0" NamePattern="EuropeanObject_{0}" NativeFormat="Svg" RepeatableRandomness="False" UriPrefix="European" />
        <nib:RandomLoader BoundingBox="PacificIslands" LinkbackUrlPattern="http://MyProviderService/id={0}" MaxAmount="10" MaxID="100" MinAmount="10" MinID="0" NamePattern="PacificObject_{0}" NativeFormat="Svg" RepeatableRandomness="False" UriPrefix="Pacific" />
      </nib:DocumentMergeStep.InputDocuments>
    </nib:DocumentMergeStep>
    <nib:PushToSimulatedApp AppKernelCacheKey="provider" ContentFormat="Svg" FailOnError="True" RunConcurrently="False">
      <nib:PushToSimulatedApp.InputDocument>
        <nib:CacheLoader SourceCacheKey="inputDocument" />
      </nib:PushToSimulatedApp.InputDocument>
    </nib:PushToSimulatedApp>
    <nib:GeoFilterStep FailOnError="True" OutputCacheKey="filteredDocument" RunConcurrently="False" SelectionBox="Europe">
      <nib:GeoFilterStep.InputDocument>
        <nib:CacheLoader SourceCacheKey="inputDocument" />
      </nib:GeoFilterStep.InputDocument>
    </nib:GeoFilterStep>
    <nib:DeepCompareStep DepthOfComparision="ID, Geo_2MeterAccuracy, PropertyBag, LinkbackUrl" FailOnError="True" RunConcurrently="False" Timeout="30000" TolerateAdditionalItems="False">
      <nib:DeepCompareStep.ReferenceSource>
        <nib:CacheLoader SourceCacheKey="filteredDocument" />
      </nib:DeepCompareStep.ReferenceSource>
      <nib:DeepCompareStep.InvestigatedSource>
        <nib:SvgWebServiceLoader GeoFilter="Europe" NvgServiceUrl="http://localhost:10000/SvgOutputPort.asmx"/>
      </nib:DeepCompareStep.InvestigatedSource>
    </nib:DeepCompareStep>
  </TestCase.ExecutionSteps>

  <TestCase.CleanupSteps>
    <nib:HttpGetStep FailOnError="True" RunConcurrently="False" Url="http://localhost:10000/Home/StopSvgPolling">
      <nib:HttpGetStep.Parameters>
        <x:String x:Key="PolledAddress">http://localhost:4000/SvgOutputPort.asmx</x:String>
      </nib:HttpGetStep.Parameters>
    </nib:HttpGetStep>
    <nib:KillSimulatedApp AppKernelCacheKey="provider" FailOnError="True" PortNumber="4000" RunConcurrently="False" />
  </TestCase.CleanupSteps>
</TestCase>

This is what it does:

  1. Invokes a Clear operation on the test subject
  2. Launches a webserver on port 4000 as a simulated partner app under the name MyProviderService
  3. Invokes the test subject via HTTP Get to poll the simulated partner
  4. Creates a new document containing geo info merged from two randomly generated inputs
  5. Pushes the document to the simulated partner - hence the test subject will pick it up via polling
  6. The test applies a geo filter on the document
  7. The deep compare step loads the filtered document as the base of comparison, and loads the content of the test subject via a web service
  8. As clean-up, it stops the polling via an HTTP GET step and kills the simulated partner's web server.

The power of BizUnit is that it merges the ease of creating tests in C# with IntelliSense and the ease of maintaining/duplicating them in XAML files. For a quick, easy read on how it works: http://kevinsmi.wordpress.com/2011/03/22/bizunit-4-0-overview/

user256890
    Maybe look into [T4 templates](http://msdn.microsoft.com/en-us/library/bb126445.aspx). I'm not familiar enough to provide a working example. Their small example looks like the start of exactly what you're after. – George Duckett Dec 15 '11 at 16:04
  • I would be really curious to see an example of this XML... intuitively though I'm going to say that expressing tests as XML instead of code is probably not a good idea, but I will reserve judgement until I see the XML. In xUnit you can use the [Theory] attribute to pass different inputs into tests; I think you'd be much better off using an approach like that. – justin.m.chase Dec 15 '11 at 16:31
  • Do you happen to have a schema for that (or was it generic XAML/another schema)? I added in a separate answer an example that fits this XML as well. It just becomes easier if you have a clean/solid schema as-is, but one can be reverse-engineered as well (as I did in the other answer that is linked) – Kallex Dec 22 '11 at 22:36

4 Answers


Instead of creating a separate test method for each set of test data, you can create a single test that is run repeatedly, once per set of test data:

[TestClass]
public class SampleTests
{
    [TestMethod]
    public void Test()
    {
        for (var i = 0; i < 10; ++i)
            XamlTestManager.ConductTest(i); 
    }
}
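In this style, the loop could be driven by the XML files themselves rather than a fixed counter. A self-contained sketch of the enumeration part (the helper name and directory are assumptions; inside a single [TestMethod] you would loop over these names and call a hypothetical ConductTest(string) overload):

```csharp
using System.IO;
using System.Linq;

public static class TestFileEnumerator
{
    // Hypothetical helper: yields one test name per XML file found in the
    // given directory, e.g. "Test1" for "Test1.xml".
    public static string[] TestNames(string testDirectory) =>
        Directory.GetFiles(testDirectory, "*.xml")
                 .Select(Path.GetFileNameWithoutExtension)
                 .ToArray();
}
```

One drawback of this approach, compared to generated methods, is that all XML files run under a single test name in the results window.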

You can also perform data-driven tests by using the DataSource attribute, which runs your test once for each row in the data set.

[TestClass]
public class SampleTests
{
    public TestContext Context { get; set; }

    [TestMethod]
    [DataSource(...)]
    public void Test()
    {
        var someData = Context.DataRow["SomeColumnName"].ToString();
        ...
    }
}
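For reference, a filled-in DataSource attribute might look like the following; the connection-string form is the standard MSTest CSV provider, while the file name, column name, and the ConductTest(string) overload are assumptions:

```csharp
[TestMethod]
[DeploymentItem("TestCases.csv")]
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
            "|DataDirectory|\\TestCases.csv", "TestCases#csv",
            DataAccessMethod.Sequential)]
public void Test()
{
    // Hypothetical: each CSV row names one XML test case to run.
    var testName = Context.DataRow["TestName"].ToString();
    XamlTestManager.ConductTest(testName);
}
```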
Martin Liversage

As @GeorgeDuckett said, T4 templates are probably the way to go. In the application I am working on, we use them for a lot of things, including generating repositories, services, ViewModels, enums and, recently, unit tests.

They are basically code-generating scripts written in either VB or C#; scanning a directory for XML files would be no problem for these kinds of templates.

If you do choose to go the T4 route, the Tangible T4 Editor is definitely a must-have; it is a free download.

Here is a quick example of a T4 script which should do or be pretty close to what you want:

<#@ template language="C#" debug="true" hostspecific="true"#>
<#@ import namespace="System.IO" #>
<#@ output extension="g.cs"#>
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class SampleTests
{
<#
string[] files = Directory.GetFiles(@"C:\TestFiles", "*.xml");
foreach(string filePath in files)
{
    string fileName = Path.GetFileNameWithoutExtension(filePath);
#>
    [TestMethod]
    public void <#=fileName#>()
    {
        XamlTestManager.ConductTest();
    }
<#
}
#>
}

Make sure this is placed in a file with the .tt extension, then in the Properties window for this file, ensure the Build Action is None and the Custom Tool is TextTemplatingFileGenerator.

Edit: Accessing output directory from T4 template

Add the following two lines to the top of your T4 template, under the <#@ template ... #> line:

<#@ assembly name="EnvDTE" #>
<#@ import namespace="EnvDTE" #>

Then inside your template, you can access and use the Visual Studio API like so:

IServiceProvider serviceProvider = this.Host as IServiceProvider;
DTE dte = serviceProvider.GetService(typeof(DTE)) as DTE;
object[] activeSolutionProjects = dte.ActiveSolutionProjects as object[];

if(activeSolutionProjects != null)
{
    Project project = activeSolutionProjects[0] as Project;
    if(project != null)
    {
        Properties projectProperties = project.Properties;
        Properties configurationProperties = project.ConfigurationManager.ActiveConfiguration.Properties;
        string projectDirectory = Path.GetDirectoryName(project.FullName);  
        string outputPath = configurationProperties.Item("OutputPath").Value.ToString();
        string outputFile = projectProperties.Item("OutputFileName").Value.ToString();

        string outDir = Path.Combine(projectDirectory, outputPath);
        string targetPath = Path.Combine(outDir, outputFile);
    }
}

outDir and targetPath contain the output directory and the full path to the output file.
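With projectDirectory in hand, the hard-coded C:\TestFiles path from the earlier template could be replaced by a project-relative lookup; a sketch (the "TestCases" folder name and helper are assumptions):

```csharp
using System.IO;

public static class TestFileLocator
{
    // Hypothetical: resolve the XML test directory relative to the project
    // directory obtained from the DTE, instead of hard-coding a path, so the
    // template works after a plain check-out on any machine.
    public static string[] FindTestFiles(string projectDirectory)
    {
        string testDirectory = Path.Combine(projectDirectory, "TestCases");
        return Directory.Exists(testDirectory)
            ? Directory.GetFiles(testDirectory, "*.xml")
            : new string[0];
    }
}
```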

Lukazoid
  • This looks fantastic! Is there a way to refer solution folders similarly to the build events, like: $(OutDir) or $(TargetPath)? – user256890 Dec 16 '11 at 09:59
  • @user256890 Thanks for spotting the typo, corrected now. You can't refer to solution folders using those handy macros. However you do have access to the Visual Studio API. I'll update my answer with an example – Lukazoid Dec 16 '11 at 10:29
  • Getting solution files: http://stackoverflow.com/questions/1352570/get-project-or-relative-directory-with-t4 – user256890 Dec 16 '11 at 10:39
  • @user256890 Yeah, that works for getting the solution file; however, I'm just looking into getting the actual output directory and output file :) – Lukazoid Dec 16 '11 at 10:48
  • I struggle to get the transformation done before each build. MSDN recommends installing the VS Visualization and Modeling SDK (http://msdn.microsoft.com/en-us/library/ee847423.aspx), but I am quite reluctant to do that - it will not work on other computers after a plain check-out. – user256890 Dec 16 '11 at 10:58
  • @user256890 I've updated my answer to include what you were after. Other than by using the SDK, I'm not quite sure how to ensure the template is transformed before every build. – Lukazoid Dec 16 '11 at 11:04
  • Invoking transformation on build: http://stackoverflow.com/questions/1646580/get-visual-studio-to-run-a-t4-template-on-every-build – user256890 Dec 16 '11 at 14:17

I actually don't think this is a job for build-time code generation; I think you should use data attributes to drive the tests in this case.

If you use xUnit, you can do that like this:

public class SampleTests
{
    [Theory]
    [InlineData(1)]
    [InlineData(2)]
    [InlineData(...)]
    [InlineData(N)]
    public void Test(int x)
    {
        XamlTestManager.ConductTest(x);
    }
}

And it will run the test once per InlineData attribute. Also, I believe there is another attribute to which you can pass a path to a file, and it will populate your parameters with values from that file...

I think NUnit has a similar feature, but xUnit is much better; I would recommend using xUnit instead.
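xUnit can also draw a Theory's data from a property, which would let the XML files themselves drive the test runs. A sketch, with caveats: the attribute is [PropertyData] in xUnit 1.x (xunit.extensions) and [MemberData] in 2.x, the directory name is an assumption, and the ConductTest(string) overload is hypothetical:

```csharp
using System.Collections.Generic;
using System.IO;
using Xunit;

public class SampleTests
{
    // One theory row per XML file found in the (assumed) TestCases directory.
    public static IEnumerable<object[]> XamlFiles
    {
        get
        {
            foreach (var file in Directory.GetFiles(@"TestCases", "*.xml"))
                yield return new object[] { Path.GetFileNameWithoutExtension(file) };
        }
    }

    [Theory]
    [MemberData("XamlFiles")]  // [PropertyData("XamlFiles")] in xUnit 1.x
    public void Test(string testName)
    {
        XamlTestManager.ConductTest(testName);  // hypothetical overload
    }
}
```

Each XML file then appears as its own row in the test results, without any generated methods.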

justin.m.chase
  • Justin, thanks for your thoughts. The XML file does not contain test data but the test scenario! I have expanded the question with a sample XML file and its explanation, as you asked. If you are still interested... – user256890 Dec 16 '11 at 14:19

I just answered a similar "code generation from XML with T4" question:

https://stackoverflow.com/a/8554949/753110

Your requirement matches exactly what we did initially (and what led to the discovery of the ADM described in that answer).

We are currently working on test-case-based generation, where the test cases are actually built by the testing staff, yet the complete integration tests are generated as code to support them.

I added a custom XML-based generation demo for that other example, if you want to see it:

https://github.com/abstractiondev/DemoSOCase8552428ABS

Kallex