I am writing a unit test for a function that validates a time value. The function appears to work correctly when I test it manually, but it fails under Xcode's Unit Test functionality. I have investigated, and the problem appears to be in the arithmetic the test itself performs. This is the Unit Test code:
- (void)testValidTime {
    for (double a = 0.0f; a < 25.0f; a++) {
        for (double b = 0.0f; b < 0.65f; b += 0.01f) {
            double c = a + b;
            if (a > 23.0f || b > 0.59f) {
                XCTAssertFalse([TimeClass timeIsValid:c], @"Test failed - Time passed as valid %f", c);
            } else {
                XCTAssertTrue([TimeClass timeIsValid:c], @"Test failed - Time rejected as invalid %f", c);
            }
        }
    }
}
Interestingly, if I step through the code, the result of the addition (c) is formatted as the correct answer in the error message (0.590000, for example), but if I examine what c actually contains, it shows as 0.589999974. If I can't trust the debugger (or the Test functionality), what can I trust?!
Any ideas gratefully received.