I have this test:
@Test
public void findsPackageClasses() throws ClassNotFoundException, IOException {
    final long startTime = System.currentTimeMillis();
    Class<?>[] classes = finder.getPackageClasses(ClassFinderStubsPackageMarker.class.getPackage());
    Arrays.sort(classes, (c1, c2) -> c1.getName().compareTo(c2.getName()));
    assertArrayEquals(STUB_CLASSES, classes);
    System.out.println(String.format("Test duration: %d ms", System.currentTimeMillis() - startTime));
}
When I run it as a JUnit test in Eclipse, the JUnit report regularly displays 0.4-0.5 sec as the duration of this test, and Infinitest also issues a Slow Test Warning for it. However, the manual calculation with currentTimeMillis() always comes out at 4-5 ms, as printed to the console. What's going on?
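One thing I plan to try is a JUnit Stopwatch rule, which times everything the rule wraps (so @Before/@After as well), not just the method body. A minimal sketch of what I have in mind; the Stopwatch rule is standard JUnit 4.12 API, while the printed message is just illustrative:

import java.util.concurrent.TimeUnit;

import org.junit.Rule;
import org.junit.rules.Stopwatch;
import org.junit.runner.Description;

@Rule
public Stopwatch stopwatch = new Stopwatch() {
    @Override
    protected void finished(long nanos, Description description) {
        // Unlike the in-method measurement, this includes @Before/@After and inner rules.
        System.out.println(String.format("%s took %d ms by the rule's clock",
                description.getMethodName(), TimeUnit.NANOSECONDS.toMillis(nanos)));
    }
};

Even this won't cover @BeforeClass, class loading or runner overhead, which might be where the missing time goes.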
I'd like to get rid of the warning by eliminating the slow part(s), i.e. moving them to an integration test, but the speed reports seem contradictory. The only candidate for a slow part is finder.getPackageClasses(), but its file system dependency is mocked out with Mockito (set up to return class literals). I even temporarily added code to the live file system path that throws an exception when it is hit, and verified it by temporarily removing the mock (the exception was indeed thrown from the live code then), so I'm confident the mock is in effect when it's injected. The other possible slow part is getPackage(), which I haven't investigated thoroughly yet, although a currentTimeMillis() delta around it returns 0 ms. Either way, the two speed reports are consistently contradictory, which I find very strange. Have you ever encountered a similar phenomenon?
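For reference, the mock wiring is roughly like the sketch below; the FileSystemScanner interface, its scan() method, the ClassFinder constructor and the array contents are simplified placeholders rather than my real code, only the JUnit/Mockito calls themselves are real API:

import static org.mockito.Mockito.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.Before;

public class ClassFinderTest {

    // Simplified stand-in for the finder's real file system collaborator.
    interface FileSystemScanner {
        Class<?>[] scan(Package pkg);
    }

    // STUB_CLASSES is the same array the test asserts against; its contents
    // here are only an illustration.
    private static final Class<?>[] STUB_CLASSES = { ClassFinderStubsPackageMarker.class };

    private FileSystemScanner scanner;
    private ClassFinder finder; // placeholder name for the class under test

    @Before
    public void setUp() {
        scanner = mock(FileSystemScanner.class);

        // The mock returns the class literals directly, so getPackageClasses()
        // never touches the file system during the test.
        when(scanner.scan(any(Package.class))).thenReturn(STUB_CLASSES);

        finder = new ClassFinder(scanner);
    }
}

The temporary "throw when live code is hit" check mentioned above lived in the production code, not in this mock setup.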