
I need execution times for

  • 1000
  • 5000
  • 10000
  • 15000
  • 20000

and so on numbers of elements, sorted with the Quick Sort algorithm, so that I can verify its time complexity from a graph of number of elements vs. time. But I still get a time of zero seconds, both for 1000 elements and for 20000. Even when I measure in milliseconds or nanoseconds, the time is still zero. Is there any way to find approximate or comparative times for different numbers of elements?

My Quick Sort code is here:

#include <bits/stdc++.h>
using namespace std;

int A[50000];

// Lomuto partition: place the pivot A[P] in its final position and
// return that position.
int part(int left, int right, int P)
{
    int V = A[P];                 // pivot value
    swap(A[P], A[right]);         // move the pivot out of the way
    int store = left;
    for(int i = left; i < right; i++)
    {
        if(A[i] <= V)
        {
            swap(A[i], A[store]);
            store++;
        }
    }
    swap(A[store], A[right]);     // put the pivot into its final slot

    return store;
}

void qSort(int left, int right)
{
    if(left < right)
    {
        // j must be a local variable: a global would be overwritten by
        // the recursive calls, corrupting the bounds of the second half.
        int j = part(left, right, left);
        qSort(left, j-1);
        qSort(j+1, right);
    }
}

int main()
{
    int nData, k, minValue, maxValue;
    cout<<"No. of Data: ";
    cin>>nData;
    cout<<"\nRange (min, max): ";
    cin>>minValue>>maxValue;
    // Note: rand() is not seeded, so the same data is generated on every run.
    for(k=0; k<nData; k++)
    {
        A[k] = minValue + rand() % (maxValue - minValue + 1);
    }
    clock_t t1 = clock();
    qSort(0, nData-1);
    clock_t t2 = clock();
    cout<<"\n\nTime: "<<(double)(t2-t1)/CLOCKS_PER_SEC<<endl;
}

[N.B.: My operating system is Windows.]

– Jahidul Islam
  • Possible duplicate of [Measuring execution time of a function in C++](http://stackoverflow.com/questions/22387586/measuring-execution-time-of-a-function-in-c) – Weak to Enuma Elish Feb 21 '16 at 06:51
  • [James Root](http://stackoverflow.com/users/4756309/james-root), it's difficult for me to get the answer from [Measuring execution time of a function in C++](http://stackoverflow.com/questions/22387586/measuring-execution-time-of-a-function-in-c) – Jahidul Islam Feb 21 '16 at 06:55
  • I don't see how. You can copy-paste the accepted answer and just change the function call. Your other option is to run the function many times and take the average, but it is less accurate. – Weak to Enuma Elish Feb 21 '16 at 06:57
  • OK, I will try, [James Root](http://stackoverflow.com/users/4756309/james-root). And thanks for the suggestion. – Jahidul Islam Feb 21 '16 at 06:59
  • If you used a variable-sized container instead of an array and filled it with random data, you could set the size on the command line, and that would make sure it's not optimized away. That said, please post your code to https://codereview.stackexchange.com; it would benefit a lot. – Ulrich Eckhardt Feb 21 '16 at 09:55

2 Answers

int main()
{
    ...
    clock_t t1 = clock();
    qSort(0, nData-1);
    clock_t t2 = clock();
    cout<<"\n\nTime: "<<(double)(t2-t1)/CLOCKS_PER_SEC<<endl;    
}

The problem here is that the compiler is too smart for this kind of simple test. When it sees code whose results are never used, it may optimize the program by removing that code entirely. You have to disable optimization (running the program in debug mode may do that), or modify the program so that the result of the sort operation is actually used in some way.
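For example, a minimal way to use the result, assuming the A, nData and qSort from the question, is to verify and print that the array is sorted after taking the second timestamp:

bool ok = std::is_sorted(A, A + nData);   // std::is_sorted is in <algorithm>; reads every element
std::cout << "sorted: " << std::boolalpha << ok << "\n";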

Also, clock() behaves differently on Windows and on POSIX systems (where it measures CPU time rather than wall time). It's simpler to use std::chrono instead. For example:

#include <iostream>
#include <chrono>

int main()
{
    // qSort, A and nData as defined in the question
    std::chrono::time_point<std::chrono::system_clock> start, end;
    start = std::chrono::system_clock::now();
    qSort(0, nData-1);
    end = std::chrono::system_clock::now();

    std::chrono::duration<double> elapsed_seconds = end - start;
    std::cout << "count: " << elapsed_seconds.count() << "\n";

    return 0;
}
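Note that std::chrono::system_clock follows the wall clock and can jump if the system time is adjusted while the program runs; std::chrono::steady_clock is usually the better choice for measuring an interval. A minimal variant of the measurement above:

auto start = std::chrono::steady_clock::now();
qSort(0, nData-1);                    // same call as above
auto end = std::chrono::steady_clock::now();
std::chrono::duration<double> elapsed_seconds = end - start;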
– Barmak Shemirani

Just repeat the task to be measured in a for-loop with a sufficiently high number of iterations, and then divide your measured time by the number of iterations.
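A minimal sketch of this, assuming the A, nData and qSort from the question (master and iterations are names introduced here for illustration); the array is restored from an unsorted copy before every sort, since re-sorting already-sorted data, the worst case for a first-element pivot, would measure something else:

// Needs <algorithm> and <chrono> (both covered by <bits/stdc++.h> in the question).
static int master[50000];
std::copy(A, A + nData, master);               // keep an unsorted copy of the input

const int iterations = 1000;                   // enough repetitions for a nonzero total
auto start = std::chrono::steady_clock::now();
for(int it = 0; it < iterations; it++)
{
    std::copy(master, master + nData, A);      // restore the unsorted input
    qSort(0, nData-1);
}
auto end = std::chrono::steady_clock::now();

std::chrono::duration<double> total = end - start;
std::cout << "average per sort: " << total.count() / iterations << " s\n";

The O(n) copy is counted inside each iteration; for the sizes in question it is small next to the sort itself, but it can be timed separately and subtracted if more precision is needed.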

– H. Guijt