119

What's the best way to calculate a time difference in C++? I'm timing the execution speed of a program, so I'm interested in milliseconds. Better yet, seconds.milliseconds.

The accepted answer works, but needs to include <ctime> or <time.h>, as noted in the comments.

Trevor Boyd Smith
  • 18,164
  • 32
  • 127
  • 177
Jack BeNimble
  • 35,733
  • 41
  • 130
  • 213
  • Dupe, but can't link right now – Robert Gould Apr 08 '09 at 01:37
  • 2
    The vote to close was too little, too late. I got a working answer. Nice try though. Btw, I couldn't find a link either. – Jack BeNimble Apr 08 '09 at 02:35
  • Is it for Windows? Then try GetTickCount (Windows API) – aJ. Apr 08 '09 at 04:06
  • added the dupe link now, but yes Jack BeNimble you beat me to the candle stick :) – Robert Gould Apr 08 '09 at 06:08
  • 6
    Robert: Luckily, because the new posting allowed several more answers, one of which I selected. Seriously, I question the value of closing a dup post. What if some solutions weren't mentioned in the first one? New technologies developed? Can't find it because of different headings? – Jack BeNimble Apr 08 '09 at 12:45
  • 2
    @JackBeNimble having been on the receiving end of a few "dups" that weren't exactly dups (maybe people who perhaps quickly read the question and mark it because it sounds similar to another question), I strongly agree with your point... probably a point for meta stack exchange :o – code_fodder Nov 16 '15 at 11:10
  • @code_fodder - Omg, exactly. – Jack BeNimble Nov 17 '15 at 13:14

14 Answers

135

See the std::clock() function.

#include <ctime>     // for std::clock() and CLOCKS_PER_SEC
#include <iostream>

const std::clock_t begin_time = std::clock();
// do something
std::cout << float(std::clock() - begin_time) / CLOCKS_PER_SEC;

If you want to calculate execution time for yourself (not for the user), it is better to do this in clock ticks (not seconds).

EDIT:
required header file - <ctime> or <time.h>

Diego Sevilla
  • 28,636
  • 4
  • 59
  • 87
bayda
  • 13,365
  • 8
  • 39
  • 48
  • When I run this code, a get a huge number, no idea what it means. I'd like something that says "9.334 seconds". – Jack BeNimble Apr 08 '09 at 00:43
  • Don't forget to #include <ctime> or <time.h> as appropriate so that clock() gets an appropriate prototype and clock_t is defined. – RBerteig Apr 08 '09 at 01:23
  • 4
    Keep in mind that even though clock() returns a number of milliseconds, the precision of clock() can be much worse than that. For instance in Windows the precision of the clock() function is something like 40 ms. – John Dibling Apr 08 '09 at 02:59
  • 2
    I tried this on Mac 10.7. My app processes a 100 MB file in 15 seconds, but the diff time is reporting 61 seconds. Not much use. I think time() is probably better. – Miek Sep 23 '13 at 22:33
  • 11
    `clock()` returns the CPU time consumed by the program. So if the program is run in parallel the time returned by the function would be the accumulated of the time spent on all CPUs, rather than the time elapsed http://www.cplusplus.com/reference/ctime/clock/ – Ameer Jewdaki Aug 03 '17 at 09:43
  • 8
    This answer is misleading because it shows the CPU time, not the actual wall clock time. – Sumsuddin Shojib May 07 '18 at 06:30
  • 1
    I have to agree with Ultraviolet here, using CPU time to measure speed of a program seems like the wrong thing to do. OP should unmark this as the right answer. IMO you should use std::chrono::steady_clock::now() as described by multiple answers in the following thread https://stackoverflow.com/questions/2808398/easily-measure-elapsed-time – Gr-Disarray Dec 26 '19 at 23:12
67

I added this answer to clarify that the accepted answer shows CPU time, which may not be the time you want. According to the reference, there is CPU time and there is wall-clock time. Wall-clock time is the actual elapsed time, regardless of other conditions such as the CPU being shared by other processes. For example, when I used multiple processors to do a certain task, the CPU time was 18 seconds, while in actual wall-clock time it took only 2 seconds.

To get the actual wall-clock time, you can do:

#include <chrono>

auto t_start = std::chrono::high_resolution_clock::now();
// the work...
auto t_end = std::chrono::high_resolution_clock::now();

double elapsed_time_ms = std::chrono::duration<double, std::milli>(t_end-t_start).count();
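
If you also want the measurement to be immune to the system clock being adjusted (see the comments below), here is a minimal variation of the above, assuming a C++11 compiler, using std::chrono::steady_clock, which is guaranteed to be monotonic:

#include <chrono>

// steady_clock is monotonic: unaffected by NTP adjustments or the user changing the clock
auto t_start = std::chrono::steady_clock::now();
// the work...
auto t_end = std::chrono::steady_clock::now();

double elapsed_time_ms = std::chrono::duration<double, std::milli>(t_end - t_start).count();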
Sumsuddin Shojib
  • 3,583
  • 3
  • 26
  • 45
  • 3
    Note this assumes the system clock doesn't change. If you're writing code to handle all circumstances you need to consider summer time, leap seconds, time syncing with NTP, etc. – parsley72 Nov 25 '20 at 01:30
  • 1
    @parsley72 Not if you use `std::chrono::steady_clock`, which is guaranteed to be monotonic – AspectOfTheNoob Feb 06 '23 at 00:01
43

If you are using C++11, here is a simple wrapper (see this gist):

#include <iostream>
#include <chrono>

class Timer
{
public:
    Timer() : beg_(clock_::now()) {}
    void reset() { beg_ = clock_::now(); }
    double elapsed() const { 
        return std::chrono::duration_cast<second_>
            (clock_::now() - beg_).count(); }

private:
    typedef std::chrono::high_resolution_clock clock_;
    typedef std::chrono::duration<double, std::ratio<1> > second_;
    std::chrono::time_point<clock_> beg_;
};

Or for C++03 on *nix:

#include <iostream>
#include <ctime>

class Timer
{
public:
    Timer() { clock_gettime(CLOCK_REALTIME, &beg_); }

    double elapsed() {
        clock_gettime(CLOCK_REALTIME, &end_);
        return end_.tv_sec - beg_.tv_sec +
            (end_.tv_nsec - beg_.tv_nsec) / 1000000000.;
    }

    void reset() { clock_gettime(CLOCK_REALTIME, &beg_); }

private:
    timespec beg_, end_;
};

Example of usage:

int main()
{
    Timer tmr;
    double t = tmr.elapsed();
    std::cout << t << std::endl;

    tmr.reset();
    t = tmr.elapsed();
    std::cout << t << std::endl;
    return 0;
}
gongzhitaao
  • 6,566
  • 3
  • 36
  • 44
  • 2
    Another option would be to use boost::chrono instead of the C++11 STL std::chrono namespace. Thank you for your code. – Didac Perez Parera Sep 18 '14 at 08:29
  • 1
    Careful: This won't work if the user changes his time between `Timer()` and the call to `elapsed()` if `!std::chrono::high_resolution_clock::is_steady` - which is the case on Linux! – jhasse Feb 09 '18 at 12:33
29

I would seriously consider the use of Boost, particularly boost::posix_time::ptime and boost::posix_time::time_duration (at http://www.boost.org/doc/libs/1_38_0/doc/html/date_time/posix_time.html).

It's cross-platform, easy to use, and in my experience offers the highest level of time resolution the operating system provides. Possibly also very important: it provides some very nice IO operators.

To use it to calculate the difference in program execution (to microseconds; probably overkill), it would look something like this [browser written, not tested]:

#include <iostream>
#include <boost/date_time/posix_time/posix_time.hpp>
using namespace boost::posix_time;

ptime time_start(microsec_clock::local_time());
//... execution goes here ...
ptime time_end(microsec_clock::local_time());
time_duration duration(time_end - time_start);
std::cout << duration << '\n';
Jeremy CD
  • 597
  • 4
  • 10
  • 3
    But boost local_time() is _not_ monotonic so it should not be used to measure time lapses. I haven't found a way to access monotonic time from Boost. – gatopeich Oct 20 '11 at 11:01
13

Boost 1.46.0 and up includes the Chrono library:

thread_clock class provides access to the real thread wall-clock, i.e. the real CPU-time clock of the calling thread. The thread relative current time can be obtained by calling thread_clock::now()

#include <boost/chrono/thread_clock.hpp>
#include <iostream>

void timed_work()
{
    using namespace boost::chrono;
    thread_clock::time_point start = thread_clock::now();
    // ... do the work ...
    thread_clock::time_point stop = thread_clock::now();
    std::cout << "duration: " << duration_cast<milliseconds>(stop - start).count() << " ms\n";
}
Gabi Davar
  • 959
  • 10
  • 13
12

On Windows: use GetTickCount

// GetTickCount is declared in <windows.h>
#include <windows.h>
#include <iostream>

int main()
{
    DWORD dw1 = GetTickCount();

    // Do something

    DWORD dw2 = GetTickCount();

    std::cout << "Time difference is " << (dw2 - dw1) << " milliseconds" << std::endl;
}
aJ.
  • 34,624
  • 22
  • 86
  • 128
  • If the program runs for a significant amount of time, beware rollover: DWORD is 32 bit unsigned, and will roll over from 0xFFFFFFFF to 0x00000000 at 2^32 msec (around 49.71 days). The pattern shown (currentTime - startTime) is correct (provided it is guaranteed that the time difference does not exceed 2^32-1 msec). – Technophile Feb 22 '21 at 21:46
  • 1
    I would never use GetTickCount because it's invalid after 50 days, use GetTickCount64 – Danil Jul 13 '21 at 09:22
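
Following up on the comments above, here is a minimal sketch (assuming Windows Vista or later) using GetTickCount64, which returns a 64-bit millisecond count and therefore avoids the ~49.7-day wrap-around:

#include <windows.h>
#include <iostream>

int main()
{
    ULONGLONG t1 = GetTickCount64();   // milliseconds since the system started, 64-bit

    // Do something

    ULONGLONG t2 = GetTickCount64();

    std::cout << "Time difference is " << (t2 - t1) << " milliseconds" << std::endl;
}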
8

You can also use clock_gettime. This method can be used to measure:

  1. System-wide real-time clock
  2. System-wide monotonic clock
  3. Per-process CPU time
  4. Per-thread CPU time

Code is as follows:

#include <time.h>
#include <iostream>

int main(){
  timespec ts_beg, ts_end;
  clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &ts_beg);
  // do something
  clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &ts_end);
  std::cout << (ts_end.tv_sec - ts_beg.tv_sec) + (ts_end.tv_nsec - ts_beg.tv_nsec) / 1e9 << " sec";
}
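
The example above uses CLOCK_PROCESS_CPUTIME_ID, i.e. per-process CPU time (item 3 in the list). As a sketch of item 2, assuming a POSIX system, CLOCK_MONOTONIC gives wall-clock elapsed time that is not affected by changes to the system clock:

#include <time.h>
#include <iostream>

int main(){
  timespec ts_beg, ts_end;
  clock_gettime(CLOCK_MONOTONIC, &ts_beg);
  // do something
  clock_gettime(CLOCK_MONOTONIC, &ts_end);
  std::cout << (ts_end.tv_sec - ts_beg.tv_sec) + (ts_end.tv_nsec - ts_beg.tv_nsec) / 1e9 << " sec";
}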

Richard
  • 56,349
  • 34
  • 180
  • 251
7

For me, the easiest way is:

#include <boost/timer.hpp>

boost::timer t;
double duration;

t.restart();
/* DO SOMETHING HERE... */
duration = t.elapsed();

t.restart();
/* DO OTHER STUFF HERE... */
duration = t.elapsed();

Using this piece of code you don't have to do the classic `end - start`.
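
Note that, if I remember the Boost documentation correctly, the timer in <boost/timer.hpp> is built on std::clock() and therefore reports CPU time; the newer Boost.Timer library provides boost::timer::cpu_timer, which reports wall, user and system time separately. A rough sketch, assuming Boost.Timer is available and linked (e.g. -lboost_timer):

#include <boost/timer/timer.hpp>
#include <iostream>

int main()
{
    boost::timer::cpu_timer t;                      // starts timing on construction

    /* DO SOMETHING HERE... */

    boost::timer::cpu_times elapsed = t.elapsed();  // wall/user/system, in nanoseconds
    std::cout << "wall: " << elapsed.wall / 1e9 << " s, "
              << "cpu: " << (elapsed.user + elapsed.system) / 1e9 << " s\n";
}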

Enjoy your favorite approach.

user1823890
  • 704
  • 8
  • 7
6

Just in case you are on Unix, you can use `time` to get the execution time:

$ g++ myprog.cpp -o myprog
$ time ./myprog
fengshaun
  • 2,100
  • 1
  • 16
  • 25
5

If you are using:

#include <ctime>   // for clock() and CLOCKS_PER_SEC

clock_t tstart = clock();

// ...do something...

clock_t tend = clock();

Then you will need the following to get time in seconds:

double time_in_seconds = (tend - tstart) / (double) CLOCKS_PER_SEC;
dvhh
  • 4,724
  • 27
  • 33
Robert White
  • 125
  • 1
  • 5
5

Just a side note: if you're running on Windows, and you really really need precision, you can use QueryPerformanceCounter. It gives you time in (potentially) nanoseconds.
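
A minimal sketch of how that typically looks (assuming Windows XP or later, where these calls always succeed):

#include <windows.h>
#include <iostream>

int main()
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);   // counts per second
    QueryPerformanceCounter(&start);

    // Do something

    QueryPerformanceCounter(&end);
    double seconds = (end.QuadPart - start.QuadPart) / (double)freq.QuadPart;
    std::cout << "Time difference is " << seconds * 1000.0 << " milliseconds" << std::endl;
}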

v3.
  • 2,003
  • 15
  • 18
4

Get the system time in milliseconds at the beginning, and again at the end, and subtract.

To get the number of milliseconds since 1970 in POSIX you would write:

#include <sys/time.h>

unsigned long long milliseconds_since_1970()
{
    struct timeval tv;

    gettimeofday(&tv, NULL);
    return ((((unsigned long long)tv.tv_sec) * 1000) +
            (((unsigned long long)tv.tv_usec) / 1000));
}

To get the number of milliseconds since 1601 on Windows you would write:

#include <windows.h>

unsigned long long milliseconds_since_1601()
{
    SYSTEMTIME systime;
    FILETIME filetime;

    GetSystemTime(&systime);
    if (!SystemTimeToFileTime(&systime, &filetime))
        return 0;

    unsigned long long ns_since_1601;
    ULARGE_INTEGER* ptr = (ULARGE_INTEGER*)&ns_since_1601;

    // copy the result into the ULARGE_INTEGER; this is actually
    // copying the result into the ns_since_1601 unsigned long long.
    ptr->u.LowPart = filetime.dwLowDateTime;
    ptr->u.HighPart = filetime.dwHighDateTime;

    // Compute the number of milliseconds since 1601; we have to
    // divide by 10,000, since the current value is the number of 100ns
    // intervals since 1601, not ms.
    return (ns_since_1601 / 10000);
}

If you cared to normalize the Windows answer so that it also returned the number of milliseconds since 1970, then you would have to adjust your answer by 11644473600000 milliseconds. But that isn't necessary if all you care about is the elapsed time.
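
For example, a small (hypothetical) helper applying that adjustment could look like this:

// Hypothetical helper: convert milliseconds since 1601 (FILETIME epoch)
// to milliseconds since 1970 (Unix epoch).
unsigned long long ms_since_1601_to_1970(unsigned long long ms_since_1601)
{
    const unsigned long long EPOCH_DIFF_MS = 11644473600000ULL;  // 1601-01-01 to 1970-01-01
    return ms_since_1601 - EPOCH_DIFF_MS;
}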

Jared Oberhaus
  • 14,547
  • 4
  • 56
  • 55
1

Here is a sample with a function:

#include <chrono>
#include <iostream>

int time_dif_in_seconds(std::chrono::system_clock::time_point start, std::chrono::system_clock::time_point end) {
    auto duration = std::chrono::duration_cast<std::chrono::seconds>(end - start);
    return duration.count();
}

int main() {
    auto start = std::chrono::system_clock::now();
    // ... do some work ...
    auto end = std::chrono::system_clock::now();
    int dif = time_dif_in_seconds(start, end);
    std::cout << std::to_string(dif) << " seconds" << std::endl;
    return 0;
}
  • What's with `_V2`? Looks non-portable. – HolyBlackCat Mar 24 '23 at 17:18
  • I don't know what's it but I compile on Ubuntu and Windows successfully using g++ in both OS. – Ivan Sansão Mar 24 '23 at 18:04
  • 1
    I wonder where you got the idea to use it, if you don't know what it is. Getting the same results on the same compiler on different OSes isn't that surprising. You need to test other C++ standard libraries: [clang's libc++](https://gcc.godbolt.org/z/Yf4nhafbb) and [MSVC STL](https://gcc.godbolt.org/z/GPq76cx5a). You should probably just remove `::_V2`, it doesn't seem to affect the result. – HolyBlackCat Mar 24 '23 at 18:33
0

This seems to work fine on an Intel Mac running 10.7:

#include <time.h>
#include <iostream>

time_t start = time(NULL);

// Do your work

time_t end = time(NULL);
std::cout << "Execution Time: " << (double)(end - start) << " Seconds" << std::endl;
Miek
  • 1,127
  • 4
  • 20
  • 35