When creating Theory test cases with xUnit, I would like to include both the parameters and the expected outcome for each case. I have used the InlineData attribute, but for tests that need heavy configuration loading this is unwieldy and does not permit reuse.

[InlineData(1,2,3,4,5,6,7,...)]

As such, I have moved the test configurations out to a separate class and now load them with the MemberData attribute, using its MemberType property to point at that class.

[Theory]
[MemberData(nameof(DataClass.Data), MemberType = typeof(DataClass))]
public void TestValidConfig(Configuration config)
{
    ...
}

However, this does not allow me to specify the expected outcome alongside the inputs, as I could with an inline attribute, e.g.

[InlineData("Input1", "Input2", "Input3", "ExpectedResult")]

I don't want to include the expected outcome with the configuration data as this will be reused in multiple tests.

Has anyone got a solution to this challenge?

So the underlying challenge is having complex test data that can be reused in multiple places, while keeping the expected outcome separate. Take a calculator as a (simplistic) example: the test data could be lists of numbers, and those same lists could be passed into an add, multiply, or subtraction test. Each of those tests needs its own expected results, which is why I want to separate the input data from the expected output.
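To make the calculator example concrete, here is a minimal sketch (all names hypothetical) of the shape I'm after: one shared set of inputs, with each theory pairing those inputs with its own expected results:

```csharp
using System.Collections.Generic;
using System.Linq;
using Xunit;

public static class CalculatorData
{
    // The reusable inputs: just the numbers, no expectations attached.
    public static readonly int[][] Inputs =
    {
        new[] { 1, 2, 3 },
        new[] { 4, 5, 6 },
    };
}

public class CalculatorTests
{
    // Each test pairs the shared inputs with its own expected results.
    public static IEnumerable<object[]> AddCases() =>
        CalculatorData.Inputs.Zip(new[] { 6, 15 },
            (input, expected) => new object[] { input, expected });

    public static IEnumerable<object[]> MultiplyCases() =>
        CalculatorData.Inputs.Zip(new[] { 6, 120 },
            (input, expected) => new object[] { input, expected });

    [Theory]
    [MemberData(nameof(AddCases))]
    public void Add(int[] input, int expected) =>
        Assert.Equal(expected, input.Sum());

    [Theory]
    [MemberData(nameof(MultiplyCases))]
    public void Multiply(int[] input, int expected) =>
        Assert.Equal(expected, input.Aggregate(1, (a, b) => a * b));
}
```

The inputs live in one place, while the expectations live next to the test that owns them.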

Adam
  • wasn't planning to aggressively close ;) Having said that, I reckon there is an answer to your puzzle in there - please let me know if your question is different and I'll be happy to reopen – Ruben Bartelink Mar 22 '19 at 16:19
  • Hi Ruben, Not entirely. I'm already using complex types, what I'm looking to do is reuse test data between tests rather than have to duplicate. Although this then needs different assertions and expected results. So the question is more around mixing complex config data with simple expected results. I've updated the last paragraph of my question to reflect this. – Adam Mar 25 '19 at 08:22
  • semi-related: https://stackoverflow.com/questions/22093843/pass-complex-parameters-to-theory (re-opened per the above) – Ruben Bartelink Mar 25 '19 at 09:06
  • in general, I'd make a memberdata that yields an object[], but internally break it down into generators for inputs + the result, i.e. `for (inputs,out) in expecteds -> yield appendarray( inputs, output)` – Ruben Bartelink Mar 25 '19 at 09:09
    Why don't you just create a separate method like `DataClass.Test1` which internally calls the `Data` method from a helper class and combines that with the expected result for that particular test? This way you can reuse your `Data` while changing the expected result per test. Or maybe I just didn't get what you meant. – kovac Mar 25 '19 at 09:26
  • @swdon I think you're right. Do you want to write that up as an answer and I'll mark this as resolved? – Adam Mar 28 '19 at 08:57

1 Answer


Here's a suggestion:

  1. Create a class to generate test data:

    internal static class TestData
    {
        public static IList<T> Get<T>(int count = 10)
        {
            // I'm using NBuilder here to generate test data quickly.
            // Use your own logic to create your test data.
            return Builder<T>.CreateListOfSize(count).Build();
        }
    }

Now, all your test classes can leverage this to get the same set of test data. So, in your data class, you would do something along the lines of

public class DataClass
{
    public static IEnumerable<object[]> Data()
    {
        return new List<object[]>
        {
            // Pair the reusable input data with this test's expected result.
            // ExpectedResult() is a static helper you define per test
            // (note: `this` is not available in a static method).
            new object[] { TestData.Get<Data>(), ExpectedResult() }
        };
    }
}

Now you can follow through with your original approach:

[Theory]
[MemberData(nameof(DataClass.Data), MemberType = typeof(DataClass))]
public void TestValidConfig(IList<Data> input, Configuration expected)
{
    ...
}
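Following the comment suggestion above, the reuse itself can be sketched like this (all names are hypothetical stand-ins): generate the shared inputs once, then expose one member per test that pairs them with that test's expected results:

```csharp
using System.Collections.Generic;
using System.Linq;
using Xunit;

// Stand-in for your real Configuration type.
public record Config(int Value);

public static class SharedTestData
{
    // Generated once (e.g. via NBuilder) and reused by every test.
    public static readonly IList<Config> Configs =
        Enumerable.Range(1, 3).Select(i => new Config(i)).ToList();
}

public static class DataClass
{
    // One member per test: same inputs, test-specific expectations.
    public static IEnumerable<object[]> DoubledCases() =>
        SharedTestData.Configs.Select(c => new object[] { c, c.Value * 2 });

    public static IEnumerable<object[]> SquaredCases() =>
        SharedTestData.Configs.Select(c => new object[] { c, c.Value * c.Value });
}

public class ConfigTests
{
    [Theory]
    [MemberData(nameof(DataClass.DoubledCases), MemberType = typeof(DataClass))]
    public void Doubles(Config config, int expected) =>
        Assert.Equal(expected, config.Value * 2);

    [Theory]
    [MemberData(nameof(DataClass.SquaredCases), MemberType = typeof(DataClass))]
    public void Squares(Config config, int expected) =>
        Assert.Equal(expected, config.Value * config.Value);
}
```

Only the `object[]`-building members know about expectations; the input data itself stays in one shared place.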

If your tests don't mutate the input data, you can collect it into a fixture and inject it through the test class constructor. This speeds up the test run since you don't have to generate the input data per test. Check out shared context for more information.
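A minimal sketch of that fixture approach, assuming a hypothetical `TestDataFixture` and a stand-in `Config` type in place of your real Configuration:

```csharp
using System.Collections.Generic;
using System.Linq;
using Xunit;

// Stand-in for your real Configuration type.
public record Config(int Value);

// Built once per test class; xUnit injects it into the constructor.
public class TestDataFixture
{
    public IList<Config> Configs { get; } =
        Enumerable.Range(1, 10).Select(i => new Config(i)).ToList();
}

public class FixtureTests : IClassFixture<TestDataFixture>
{
    private readonly TestDataFixture _fixture;

    public FixtureTests(TestDataFixture fixture) => _fixture = fixture;

    [Fact]
    public void AllConfigsHavePositiveValues() =>
        Assert.All(_fixture.Configs, c => Assert.True(c.Value > 0));
}
```

With `IClassFixture<T>`, the fixture is constructed once for the class and shared by every test in it, so expensive data generation happens only once.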

kovac