I'm trying to measure the execution time of a test case and increase the fraction of it spent in the method under test. Here is what I'm doing:
- Let's assume I'm testing the method abc() using testAbc().
- I'm using Android Studio and JUnit for development.
- At the very beginning of abc() I record a timestamp in nanoseconds in a start variable; when the method finishes, it returns the difference between the current nanosecond timestamp and start.
- testAbc() is divided into three parts: initialization, calling abc(), and assertion (checking the test results).
- I track the total test time inside testAbc() the same way I do inside abc(); a sketch of this pattern appears after the modified code below.
- After executing the test, I found that abc() takes about 45-50% of the total test time.
- I modified testAbc() as follows:
void testAbc() {
    long startTime = System.nanoTime();
    long abcTime = 0;

    // no modification to the initialization part

    // the "testing abc" part is placed in a for loop to increase its
    // share of the execution time
    for (int i = 0; i < 100; i++) {
        // test abc code goes here ...
        abcTime += abc(); // abc() returns its own elapsed time in nanoseconds
    }

    // assertion part wasn't modified

    long testEndTime = System.nanoTime() - startTime;
}
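For completeness, here is a minimal, self-contained sketch of the whole timing pattern: how abc() times itself (as described above) and how I compute the 45-50% figure. The class name and doRealWork() are just illustrative stand-ins for my real code, which I can't post:

class AbcTimingSketch {

    // Stand-in workload so the sketch actually runs; my real abc() logic differs.
    void doRealWork() {
        double sink = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sink += Math.sqrt(i);
        }
        if (sink < 0) System.out.println(sink); // keep the loop from being optimized away
    }

    // Times itself and returns the elapsed nanoseconds, like my abc() does.
    long abc() {
        long start = System.nanoTime();
        doRealWork();
        return System.nanoTime() - start;
    }

    void testAbc() {
        long startTime = System.nanoTime();
        long abcTime = 0;
        // initialization would go here
        for (int i = 0; i < 100; i++) {
            abcTime += abc();
        }
        // assertions would go here
        long testEndTime = System.nanoTime() - startTime;
        // this is the ratio I'm reporting; in this stripped-down sketch it comes
        // out near 1.0 because init/assert are empty, but in my real test it is 45-50%
        System.out.printf("abc/test ratio = %.2f%n", (double) abcTime / testEndTime);
    }

    public static void main(String[] args) {
        new AbcTimingSketch().testAbc();
    }
}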
By repeating the tested part, I expected the ratio between abcTime and testEndTime to increase (since abcTime grows dramatically); however, it didn't change at all: it is still 45-50%.
My questions:
- Why didn't the ratio increase? In principle, the execution time of the initialization and assertion parts should not be affected by the for loop, so after 100 repetitions the time spent in abc() should get much closer to the total test time (see the worked numbers after these questions).
- How can I increase the ratio between the time spent in abc() and the total execution time of testAbc()?
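To make my expectation in the first question concrete (with made-up numbers): if initialization plus assertion take a fixed 10 ms and one abc() call takes 10 ms, a single call gives 10 / (10 + 10) = 50%, while 100 calls should give 1000 / (10 + 1000) ≈ 99%. In general the expected ratio is 100 * t_abc / (t_fixed + 100 * t_abc), which approaches 100% as the loop count grows, which is why the unchanged 45-50% surprises me.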
Thank you for your time.