
For the given program I'm getting different results on Windows (VS 17) compared to a Linux machine (gcc 4.8).

#include "CrossDevelopment.h"
using namespace std;

int main()
{

    for (auto i = 0; i < 3; i++)
    {
//chrono::high_resolution_clock::time_point start_time = chrono::high_resolution_clock::now();
chrono::system_clock::time_point start_time = chrono::system_clock::now();

        for (auto i = 0; i < 50; i++) {
        int a = 10;
        int b = 5;
        int c = a + b;
                c += 10;
                c *= a;
                a *= b;

    }
//chrono::high_resolution_clock::time_point end_time = chrono::high_resolution_clock::now();
chrono::system_clock::time_point end_time = chrono::system_clock::now();
    auto elapsed_time = chrono::duration<double, micro>(end_time - start_time);
    

cout << "Difference of time " << elapsed_time.count() << " " << (end_time - start_time).count() 
<< " " << (chrono::duration_cast<chrono::nanoseconds>(end_time - start_time)).count() << endl;
}

getchar();
return 0;
}

Output on Windows machine

Difference of time 1 10 1000

Difference of time 0.7 7 700

Difference of time 0.7 7 700

On Linux machine

Difference of time 0.806 806 806

Difference of time 0.6 600 600

Difference of time 0.542 542 542


If you look at the last column you will notice the difference, which is not the case with high_resolution_clock.

Arun Pal
  • Possible duplicate of [Correct way of portably timing code using C++11](https://stackoverflow.com/questions/42603590/correct-way-of-portably-timing-code-using-c11) – Superlokkus Mar 12 '19 at 12:26

1 Answer


The precision of system_clock::time_point is not portable across platforms. But one can easily inspect it, and/or convert it to a known precision as you have done in your question.

The easiest way to inspect it is to use my date.h header:

#include "date/date.h"
#include <iostream>

int
main()
{
    using namespace std;
    using namespace std::chrono;
    using date::operator<<;
    auto start_time = system_clock::now();
    auto end_time = system_clock::now();
    cout << end_time - start_time << '\n';
}

On gcc this is going to output something like:

1730ns

On Windows:

17[1/10000000]s

On macOS:

1µs

Explanation:

On gcc, system_clock::time_point has nanosecond precision; on Windows, it has a precision of 1/10'000'000 of a second (100ns); and on macOS, microsecond precision.

You can inspect the precision without the date.h header by looking at system_clock::duration::period::num and system_clock::duration::period::den, which are the numerator and denominator of the fraction of a second that each tick represents (1 and 10'000'000 on Windows).
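
For example, here is a minimal sketch (no date.h required) that prints system_clock's tick period on the current platform:

#include <chrono>
#include <iostream>

int
main()
{
    using sc = std::chrono::system_clock;
    // period::num / period::den is the length of one tick in seconds,
    // e.g. 1/1'000'000'000 with gcc and 1/10'000'000 on Windows.
    std::cout << sc::duration::period::num << '/'
              << sc::duration::period::den << " of a second per tick\n";
}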

The ability to print out durations with their units (as date.h allows) is now part of the C++20 specification.
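
So with a C++20 standard library that already ships the chrono stream insertion operators (a sketch, assuming your toolchain implements that part of C++20), the earlier example works without date.h:

#include <chrono>
#include <iostream>

int
main()
{
    using namespace std::chrono;
    auto start_time = system_clock::now();
    auto end_time = system_clock::now();
    // C++20 operator<< prints the duration with its unit suffix, e.g. "1730ns"
    std::cout << end_time - start_time << '\n';
}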

Howard Hinnant
  • The precision of system_clock::time_point is not portable across platforms. Why? – Arun Pal Mar 13 '19 at 07:13
  • The design of `<chrono>` is such that it allows you to get the very best out of your timing hardware, without it having to lie to you (advertise a precision finer than it can deliver), and without having to use an API that is written to the lowest common denominator (use a precision that is coarser than it can deliver). `<chrono>` vendors can deliver whatever precision they feel would be most useful to their clients for a given platform. Clients can either take advantage of whatever (discoverable) precision `<chrono>` delivers, or easily convert to any convenient precision they desire. – Howard Hinnant Mar 13 '19 at 13:35
  • Additionally, finer precision isn't free. There are three tuning knobs: size, range and precision. You can have any two, but at the cost of the third. The size of `system_clock::time_point` is typically fixed at 64 bits (it doesn't have to be, but that is what is practical on today's hardware). Given that size, precision and range are sacrificed for one another. nanoseconds -> +/- 292 years. 100ns -> +/- 29.2 thousand years. microseconds -> +/- 292 thousand years. – Howard Hinnant Mar 13 '19 at 14:12
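
For reference, a quick back-of-the-envelope sketch of that trade-off, which just divides the maximum signed 64-bit tick count by each tick rate (assuming a 64-bit representation, as the comment above does):

#include <cstdint>
#include <iostream>

int
main()
{
    constexpr double max_ticks = static_cast<double>(INT64_MAX);
    constexpr double seconds_per_year = 3600.0 * 24 * 365.25;
    // Prints roughly 292, 29'227 and 292'271 years respectively.
    std::cout << "nanoseconds : +/- " << max_ticks / 1e9 / seconds_per_year << " years\n";
    std::cout << "100ns ticks : +/- " << max_ticks / 1e7 / seconds_per_year << " years\n";
    std::cout << "microseconds: +/- " << max_ticks / 1e6 / seconds_per_year << " years\n";
}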