
So I was trying to use std::chrono::high_resolution_clock to time how long something takes to execute. I figured you could just take the difference between the start time and the end time...

To check my approach works, I made the following program:

#include <iostream>
#include <chrono>
#include <vector>

void long_function();

int main()
{
    std::chrono::high_resolution_clock timer;
    auto start_time = timer.now();

    long_function();

    auto end_time = timer.now();
    auto diff_millis = std::chrono::duration_cast<std::chrono::duration<int, std::milli>>(end_time - start_time);

    std::cout << "It took " << diff_millis.count() << "ms" << std::endl;
    return 0;
}

void long_function()
{
    //Should take a while to execute.
    //This is calculating the first 100 million
    //fib numbers and storing them in a vector.
    //Well, it doesn't actually, because it
    //overflows very quickly, but the point is it
    //should take a few seconds to execute.
    std::vector<unsigned long> numbers;
    numbers.push_back(1);
    numbers.push_back(1);
    for(int i = 2; i < 100000000; i++)
    {
        numbers.push_back(numbers[i-2] + numbers[i-1]);
    }
}

The problem is, it just outputs 3000ms exactly, when it clearly didn't take exactly that long.

On shorter problems, it just outputs 0ms... What am I doing wrong?

EDIT: If it's of any use, I'm using the GNU GCC compiler with -std=c++0x flag on

ildjarn
Bertie Wheen
  • On my windows box it reports accurate times using both MSVC 17.00.50727.1 (VSExpress2012) and GCC 4.8.0 20120924 – sehe Nov 27 '12 at 23:40
  • 4
    Nitpick: prefer `typedef std::chrono::high_resolution_clock timer; auto start_time = timer::now();` as now() is static member – sehe Nov 27 '12 at 23:40
  • I don't see anything wrong with the code, for shorter times, you probably just need to change the period to `std::nano`, etc. – Jesse Good Nov 27 '12 at 23:43
  • Oh ok. It's just that it was outputting _exactly_ 3000ms, every time. – Bertie Wheen Nov 27 '12 at 23:46
  • Sorry for being stupid, but am struggling to find out the version. – Bertie Wheen Nov 27 '12 at 23:55
  • 2
    Well, the actual tick period of the timer is unspecified. You're converting it to milliseconds (BTW you could just use the predefined duration `std::chrono::milliseconds`) but it could be anything. If the period were seconds then getting an exact multiple of 1000 would be expected. (of course high_resolution_clock would hopefully use a smaller period and IIRC libstdc++ uses nanoseconds...) – bames53 Nov 28 '12 at 00:00
  • to get the version just use the flag `-v`. – bames53 Nov 28 '12 at 00:01
  • For me (GCC 4.7.2, Linux 3.6.7) it works fine and reports reasonable amounts of time. However, I wonder why your duration cast is so complicated. You can simply say `std::chrono::duration_cast(end_time - start_time)`. – jogojapan Nov 28 '12 at 01:24
  • 1
    In fact, the duration-cast you currently use _can_ have an impact on the times you get because you cast to `duration`, which means the type used for the internal time representation becomes `int`. On platforms where `int` is small, this may cause problems. If you really want to choose your own representation type, `intmax_t` would be a better choice. – jogojapan Nov 28 '12 at 01:44

2 Answers


The resolution of the high_resolution_clock depends on the platform.

Printing the following will give you an idea of the resolution of the implementation you are using:

    std::cout << "It took " << std::chrono::nanoseconds(end_time - start_time).count() << std::endl;
Vicente Botet Escriba

I got a similar problem with g++ (rev5, built by the MinGW-W64 project) 4.8.1 under Windows 7.

int main()
{
    auto start_time = std::chrono::high_resolution_clock::now();
    int temp(1);
    const int n(1e7);
    for (int i = 0; i < n; i++)
        temp += temp;
    auto end_time = std::chrono::high_resolution_clock::now();
    std::cout << std::chrono::duration_cast<std::chrono::nanoseconds>(end_time - start_time).count() << " ns.";
    return 0;
}

If n=1e7 it displays 19999800 ns, but if n=1e6 it displays 0 ns.

The precision seems weak.

Renaud
  • You should send the 'temp' variable to std::cout too, otherwise, the compiler might just eliminate it together with the whole loop, cleverly seeing you're not doing anything with it. – Johan Boulé Apr 05 '15 at 18:54
  • Maybe mingw's high resolution clock wasn't actually high resolution on that version. It looks like something in the order of one or two milliseconds (with n=1e6) might end up as zero. Perhaps repeating the measure would have shown "bursts" e.g. at 10 or 15ms, indicating a low resolution of about 10-15ms (the default "HZ" value for windows, in linux parlance) – Johan Boulé Apr 05 '15 at 19:02