
So I have something like so:

class MyClass {
    private double myVar;

    public MyClass(double someValue) {
        // Performs some operation to determine myVar
    }

    public double calculateThing() {
        // Returns some arithmetic operation on myVar
    }

    public double calculateOtherThing() {
        double newVar = calculateThing();
        // Returns some arithmetic operation on newVar
    }
}

calculateThingUnitTest() {
    MyClass myClass = new MyClass(someValue);

    Assert::AreEqual(someConstant, myClass.calculateThing());
}

calculateOtherThingUnitTest() {
    MyClass myClass = new MyClass(someValue);

    Assert::AreEqual(someOtherConstant, myClass.calculateOtherThing());
}

It is clear that calculateThingUnitTest is a proper unit test because it initializes a class, gives it some literal-defined independent value in the constructor, and the assertion it makes is based only on calculateThing(), so it tests one "unit" of the application.

However, calculateOtherThingUnitTest exercises calculateOtherThing, which in turn calls calculateThing to compute its result. So if calculateThing fails, calculateOtherThing will fail as well, even if its own logic is correct.

To prevent this, you would want to substitute a mock value for the result of calculateThing. You could achieve this by adding a new member variable to MyClass and having calculateThing store its result there. Then, before calling calculateOtherThing, you could simply assign a literal value to that member variable, and you would have a proper unit test.
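Concretely, the workaround described above would look something like the sketch below. The names are illustrative, and `thingResult` is a hypothetical member added only to create a test seam:

```csharp
class MyClass {
    // Hypothetical seam: calculateThing stores its result here, and a test
    // can overwrite it with a literal before calling calculateOtherThing.
    public double thingResult;

    public double calculateThing() {
        thingResult = /* some arithmetic operation on myVar */;
        return thingResult;
    }

    public double calculateOtherThing() {
        // Reads the seam instead of calling calculateThing directly
        return /* some arithmetic operation on thingResult */;
    }
}
```

The test would then set `myClass.thingResult = someLiteral;` before asserting on `calculateOtherThing()`, which is exactly the public exposure the question objects to.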

Adding a new member variable and exposing it to the public seems extraordinarily excessive, however. Is there a better way to achieve this?

Note: Although I am using pseudocode, I am working in C# with Visual Studio .Net Unit Test Project.

Shane Duffy
  • Well it isn't really a problem, more so it seems like it would be "bad practice" in the realm of unit testing (which I am fairly new to), so I'm curious about what strategy you would implement to make it "good practice." Also in my specific case, it would benefit the overall design to keep the method as non-static. I guess I'm asking what would be the "proper" way to implement unit testing in this, assuming the method is non-static. – Shane Duffy Jun 27 '19 at 20:31
  • 1
The difficulty in isolating that target function should be an indicator that the function has too many responsibilities, which is a Single Responsibility Principle (SRP) violation as it relates to SOLID code. – Nkosi Jun 27 '19 at 20:43
  • 1
Unit testing poor code does not change the fact that the code is poorly designed. It should, however, help identify the poor code so that it can be refactored accordingly. – Nkosi Jun 27 '19 at 20:46
  • Read up on SOLID principles. You may be misinterpreting my explanation. – Nkosi Jun 27 '19 at 20:55

1 Answer


What you are looking for is a partial mock. You mock the method that is not under test and leave the method under test with its default implementation. When you then call the method under test, the other methods are not executed, because they are mocked.

So you could mock calculateThing and then test the other methods as they are.
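With Moq, for example, a partial mock looks roughly like the sketch below. This assumes calculateThing is made `virtual` so Moq can override it; the doubling in calculateOtherThing and the values 5.0/10.0 are placeholders for illustration:

```csharp
using Moq;

public class MyClass {
    // Must be virtual so the mocking framework can intercept it
    public virtual double calculateThing() { /* real arithmetic */ return 0; }

    // Example dependent method: doubles whatever calculateThing returns
    public double calculateOtherThing() { return calculateThing() * 2; }
}

[TestMethod]
public void calculateOtherThingUnitTest() {
    // CallBase = true keeps the real implementation for anything not set up
    var mock = new Mock<MyClass> { CallBase = true };

    // Replace only calculateThing with a canned value
    mock.Setup(m => m.calculateThing()).Returns(5.0);

    // calculateOtherThing runs for real, but sees the mocked calculateThing
    Assert.AreEqual(10.0, mock.Object.calculateOtherThing());
}
```

This way the test of calculateOtherThing no longer depends on calculateThing being correct.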

Stackoverflow link

Shahzad
  • This is EXACTLY what I was looking for! Thanks! – Shane Duffy Jun 27 '19 at 20:59
  • I had actually just come up with a solution on my own: creating an extended class which overrides the return value of the inner functions. Which is exactly what this happens to achieve (but in a much cleaner way). – Shane Duffy Jun 27 '19 at 21:01
  • @ShaneDuffy Welcome. Your approach works too. Though I will be a little sceptical of changing the architecture too much for the sake of tests when we have good tools at our disposal. – Shahzad Jun 27 '19 at 21:03
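The subclass-based alternative mentioned in the comments, without any mocking library, might look like this sketch (names are illustrative, and it likewise requires calculateThing to be `virtual` in MyClass):

```csharp
// Test-only subclass that replaces the inner method with a canned value
class MyClassWithStubbedThing : MyClass {
    public MyClassWithStubbedThing(double someValue) : base(someValue) { }

    public override double calculateThing() {
        return 5.0; // literal stand-in for the real computation
    }
}

// calculateOtherThing now depends only on the stubbed value
var stub = new MyClassWithStubbedThing(someValue);
Assert.AreEqual(someOtherConstant, stub.calculateOtherThing());
```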