This is my very first question here and I'm a complete newbie at C++, but I'll do my best to be as specific as possible. Please tell me if I'm being too vague.
I am trying to measure the time it takes for a sorting method (merge sort) to sort a given array of integers, using <chrono> and duration_cast. Here is the code snippet in question:
    auto t1 = std::chrono::high_resolution_clock::now();
    mergesort(sortingArray, temp, 0, num - 1);
    auto t2 = std::chrono::high_resolution_clock::now();

    // duration<double, std::milli> holds the elapsed time in milliseconds
    std::chrono::duration<double, std::milli> fp_ms = t2 - t1;
    std::cout << fp_ms.count() << " milliseconds\n";
And the output I get is always "0 milliseconds", no matter how big I make the array being sorted. Even when it sorts a million integers and there is a noticeable execution time, it still prints the same output.
I'm basically following the example given here: http://en.cppreference.com/w/cpp/chrono/duration/duration_cast
Only instead of f(), I'm timing my mergesort function. How can I make it measure my sorting method properly?
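In case it helps, here is a minimal, self-contained version of what I'm doing. My real mergesort isn't shown here, so this sketch uses std::sort as a stand-in with the same call signature, and it fills the array with reverse-sorted values just so the example compiles and runs on its own:

    #include <algorithm>
    #include <chrono>
    #include <iostream>
    #include <vector>

    // Stand-in for my real mergesort(arr, temp, lo, hi); it just delegates to
    // std::sort so this example is self-contained.
    void mergesort(std::vector<int>& arr, std::vector<int>& temp, int lo, int hi)
    {
        (void)temp; // my real function uses temp as a scratch buffer
        std::sort(arr.begin() + lo, arr.begin() + hi + 1);
    }

    int main()
    {
        const int num = 1000000;
        std::vector<int> sortingArray(num);
        std::vector<int> temp(num);
        for (int i = 0; i < num; ++i)
            sortingArray[i] = num - i; // reverse-sorted input

        auto t1 = std::chrono::high_resolution_clock::now();
        mergesort(sortingArray, temp, 0, num - 1);
        auto t2 = std::chrono::high_resolution_clock::now();

        std::chrono::duration<double, std::milli> fp_ms = t2 - t1;
        std::cout << fp_ms.count() << " milliseconds\n";
    }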
EDIT: I'm compiling with MinGW from PowerShell on Windows 10. The command looks like this:
    g++ -std=c++11 .\Merge.cpp