
I have to time how long a bubble sort takes and print the elapsed time. In my program the printed time is always 0.00 seconds. Can anyone tell me what I'm doing wrong?

#include <iostream>
#include <iomanip>
#include <cstdlib>
#include <ctime>
using namespace std;

// printArray and bubbleSort are defined elsewhere in the program
void printArray(int arr[], int size);
void bubbleSort(int arr[], int size);

int main()
{
    srand((unsigned)time(NULL));
    int arr[5000], arr2[5000];
    int i;
    time_t start, end;
    double timeDiff;

    // Fill both arrays with the same random values 1-100
    for(i=0; i < 5000; i++)
    {
        arr[i] = rand() % 100 + 1;
        arr2[i] = arr[i];
    }

    cout << "Here is the initial array:" << endl;
    printArray(arr, 5000);

    time(&start);
    bubbleSort(arr, 5000);
    time(&end);
    timeDiff = difftime(end, start);

    cout << "\nHere is the array after a bubble sort:" << endl;
    printArray(arr, 5000);
    cout << fixed << setprecision(2) << "\nIt took " << timeDiff << " seconds to bubble sort the array." << endl;

    system("pause");
    return 0;
}
Joshua
  • It sounds like the granularity of time_t is too large to capture the time taken by your sorting routine (i.e. `bubbleSort` executes in less than 1 second). You'll need to run the routine multiple times in order for it to take long enough to register. – John Bode Feb 11 '11 at 20:29
  • Note that bubble sort has no place in Computer Science other than as an illustration of how not to do things. – Jim Balter Feb 12 '11 at 05:05
  • This was an assignment for a class. We had to create an array of 5000, fill it with numbers 1-100, and then run two different sort methods on it. – Joshua Feb 12 '11 at 18:02

3 Answers


I think you need to use something that has a little more precision than difftime, which only reports whole seconds.

See: Time difference in C++ for more information.
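
For example, here is a minimal sketch using clock() from <ctime> (just one option; the linked question covers others). Note that clock() measures processor time rather than wall-clock time, which for a single-threaded, CPU-bound sort is usually close enough:

#include <ctime>

clock_t c0 = clock();              // CPU time before the sort
bubbleSort(arr, 5000);
clock_t c1 = clock();              // CPU time after the sort

// CLOCKS_PER_SEC converts clock ticks to seconds
double seconds = (double)(c1 - c0) / CLOCKS_PER_SEC;
cout << "Sort took " << seconds << " seconds." << endl;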

jwir3
  • http://stackoverflow.com/questions/275004/c-timer-function-to-provide-time-in-nano-seconds is also worth looking at – Pete Kirkham Feb 11 '11 at 20:41

It executes faster than the CPU clock updates. What you have to do is execute your sort a couple of million times, time that, and divide the total time by the number of iterations (making sure you use a double for the most precision you can get). So basically something like:

const int runs = 1000000;

time(&start);
for (int r = 0; r < runs; ++r)
    bubbleSort(arr, 5000);
time(&end);

timeDiff = difftime(end, start);
double realduration = timeDiff / (double)runs;
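
One possible refinement, not part of the original answer: as the comments below note, after the first pass the array is already sorted, so each run should start from the same unsorted copy. A minimal sketch of that, assuming arr2 still holds the original unsorted data (perSort is just an illustrative name):

const int runs = 1000;                     // fewer runs are needed once each one does real work

time(&start);
for (int r = 0; r < runs; ++r)
{
    std::copy(arr2, arr2 + 5000, arr);     // restore the unsorted data before each run (needs <algorithm>)
    bubbleSort(arr, 5000);
}
time(&end);

timeDiff = difftime(end, start);
double perSort = timeDiff / (double)runs;  // average seconds per sort (includes the copy overhead)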
Blindy
  • Whilst true, with a timer precision of 1 second and a run time of just under a second (adding a bubble sort implementation to the code above and running on my laptop), running for 10 days to get an output with an accuracy of 10 ms seems rather inefficient compared to using a better timer. Or run it for 100 s, then you know that you've got your 0.01 s precision. – Pete Kirkham Feb 11 '11 at 20:39
  • A problem with the loop as presented is that after the 1st iteration, the array is sorted. Thereafter, you are only timing the best case scenario of the bubble sort--an already sorted array. – Sparky Feb 11 '11 at 20:55
  • Due to what Sparky already pointed out, not only is this inefficient, it's incorrect. – jwir3 Feb 12 '11 at 06:39

5000 elements is too small to register; you need to run the whole sorting process 100 or 1000 times and then divide by that number to get a meaningful time.

I was told that to get a reasonably accurate measurement you need to have the program run for 5 to 10 seconds.
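
For illustration only, a minimal sketch of that idea (not from the answer): keep repeating the sort until at least 5 seconds of wall-clock time have passed, then divide by the repetition count. The copy back from arr2 is assumed, as in the comments above, and names like t0, reps, and perSort are just for this sketch:

time_t t0 = time(NULL);
long reps = 0;

// Repeat until at least 5 seconds have elapsed, so the 1-second
// resolution of time() stops dominating the measurement.
while (difftime(time(NULL), t0) < 5.0)
{
    for (int k = 0; k < 5000; ++k)   // restore the unsorted data before each run
        arr[k] = arr2[k];
    bubbleSort(arr, 5000);
    ++reps;
}

double elapsed = difftime(time(NULL), t0);
double perSort = elapsed / (double)reps;   // rough seconds per sort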

Tbone