
I need a way to find out the amount of time taken by a function, or by a section of code inside a function, to execute.
Does Visual Studio provide any mechanism for doing this, or is it possible to do it from the program itself using MFC functions? I am new to MFC, so I am not sure how this can be done. I thought this would be a fairly straightforward operation, but I cannot find any examples of how it may be done either.

– user13267

2 Answers


A quick but quite imprecise way is to use GetTickCount():

#include <windows.h>   // GetTickCount()

DWORD time1 = GetTickCount();

// Code to profile

DWORD time2 = GetTickCount();

DWORD timeElapsed = time2 - time1;   // elapsed time in milliseconds

The problem is that GetTickCount() uses the system timer, which has a typical resolution of 10-15 ms, so it is only useful for fairly long computations.

It can't tell the difference between a function that takes 2 ms to run and one that takes 9 ms. But if you are in the seconds range, it may well be enough.
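If the code you want to time is much shorter than that, a common workaround is to call it many times in a loop and divide the total by the iteration count. Here is a minimal sketch of that idea (not from the answer; FunctionToProfile() is a hypothetical stand-in for the code being measured):

#include <windows.h>
#include <cstdio>

void FunctionToProfile();   // hypothetical function being measured

void ProfileWithTickCount()
{
    const int iterations = 10000;             // pick a count large enough that the loop runs for seconds
    DWORD start = GetTickCount();
    for (int i = 0; i < iterations; ++i)
        FunctionToProfile();
    DWORD elapsed = GetTickCount() - start;   // total time in milliseconds
    printf("Average per call: %.4f ms\n", (double)elapsed / iterations);
}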

If you need more resolution, you can use the performance counter, as Redeye explains in the other answer.

Or you can try a profiler (maybe this is what you were looking for?). See this question.

– MikMik

There may be better ways, but I do it like this:

// At the start of the function
LARGE_INTEGER lStart;
QueryPerformanceCounter(&lStart);
LARGE_INTEGER lFreq;
QueryPerformanceFrequency(&lFreq);   // counter ticks per second

// At the end of the function
LARGE_INTEGER lEnd;
QueryPerformanceCounter(&lEnd);
TRACE("FunctionName t = %lldms\n", (1000 * (lEnd.QuadPart - lStart.QuadPart)) / lFreq.QuadPart);

I use this method quite a lot for optimising graphics code, for example finding the time taken for screen updates. There are other ways of doing the same or similar things, but this one is quick and simple.
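If you time many functions this way, the boilerplate can be wrapped in a small scope-based helper. This is only a sketch, not part of the answer: the CScopeTimer class and the CMyView usage below are hypothetical, and TRACE assumes an MFC debug build.

#include <afx.h>   // TRACE (MFC); also pulls in the Windows headers

// Hypothetical helper: starts the counter in its constructor and
// reports the elapsed time when it goes out of scope.
class CScopeTimer
{
public:
    explicit CScopeTimer(LPCTSTR name) : m_name(name)
    {
        QueryPerformanceFrequency(&m_freq);
        QueryPerformanceCounter(&m_start);
    }
    ~CScopeTimer()
    {
        LARGE_INTEGER end;
        QueryPerformanceCounter(&end);
        TRACE(_T("%s t = %lldms\n"), m_name,
              (1000 * (end.QuadPart - m_start.QuadPart)) / m_freq.QuadPart);
    }
private:
    LPCTSTR m_name;
    LARGE_INTEGER m_start;
    LARGE_INTEGER m_freq;
};

// Usage: declare one at the top of the section you want to time
// (CMyView is a hypothetical MFC view class).
void CMyView::OnDraw(CDC* pDC)
{
    CScopeTimer timer(_T("CMyView::OnDraw"));
    // ... drawing code to profile ...
}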

– Redeye
  • This is ... strange. Usage of `TRACE` implies that you are running your code inside the debugger, and AFAIC, measuring timings when running in the debugger isn't a really great idea in general. – Martin Ba Jul 09 '13 at 13:43
  • Why not? If you want to improve your code and you manage to get it to run twice as fast in debug mode, it will probably also run twice as fast in release mode. – Jabberwocky Jul 09 '13 at 13:58
  • AFAIK the timings are perfectly accurate. The times measured probably won't be the same as in release mode but as Michael says, if you can improve performance in debug mode, it usually follows in release mode too. If you want to get timings in release mode, just log the output in a different way. – Redeye Jul 09 '13 at 14:03
  • @MichaelWalz - (1) I was talking about running *inside the debugger* which does not imply running in the debug mode. (2) As you correctly observed, `TRACE` will only be enabled in debug mode. And (3) **No**, measuring speed-ups in debug mode **does not** imply anything for release mode. – Martin Ba Jul 09 '13 at 14:04
  • @MichaelWalz, Redeye - There are quite a few cases where code changes that lead to performance gains in Debug mode have *no* measurable impact in release mode *at all*. Sometimes (less common) even the opposite effect. Ditto for running under the debugger. I do not doubt that there are indeed cases where the gains are proportional, but you can't generalize. (Since you mention graphics code -- maybe the debug mode has little impact in your gfx code so you get away with it.) I'd say roughly 50:50 ... in 50% of the cases the debug measurements are OK, in 50% they are complete bogus. Hence: Don't. – Martin Ba Jul 09 '13 at 14:08