I want to write a function in C that checks millions of parameters: if all of them are true, the function returns true as well; otherwise, it returns false.

However, estimating the time of this operation is important: we need to know how many milliseconds it takes (an approximate figure would be enough), so that we can determine the throughput of this function.

Note: These parameters are read locally from a file, and we use ordinary computers.

Questioner

1 Answer


Rather than estimating the time, measure it. Modern CPU architectures perform optimizations so complex that a simple change in data ordering can increase the running time of your function by a factor of six or more.
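For instance, a minimal timing harness could look like the sketch below. It assumes a POSIX system with `clock_gettime`; `check_all` and the all-true `params` array are hypothetical stand-ins for your real check and data:

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdio.h>
#include <time.h>

/* Hypothetical stand-in for the real check: returns true only
 * if every parameter in the array is true. */
static bool check_all(const bool *params, size_t n) {
    for (size_t i = 0; i < n; i++) {
        if (!params[i])
            return false;   /* short-circuit on the first failure */
    }
    return true;
}

int main(void) {
    enum { N = 10 * 1000 * 1000 };
    static bool params[N];              /* static: too big for the stack */
    for (size_t i = 0; i < N; i++)
        params[i] = true;               /* worst case: scan everything */

    struct timespec start, end;
    clock_gettime(CLOCK_MONOTONIC, &start);
    bool result = check_all(params, N);
    clock_gettime(CLOCK_MONOTONIC, &end);

    double ms = (end.tv_sec - start.tv_sec) * 1000.0
              + (end.tv_nsec - start.tv_nsec) / 1e6;
    printf("result=%d, elapsed=%.3f ms\n", result, ms);
    return 0;
}
```

Repeat the measurement several times and look at the distribution rather than a single number, since other processes and CPU frequency scaling add noise.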

In your case it is very important to run a realistic benchmark: all parameters that you check need to be placed in memory at the same positions as in the actual program, and your code should check them in the same order. That way you will see the effect of caching. Since your function is all-or-nothing, branch prediction will have almost no effect on your code, because prediction can fail at most once before the loop exits.
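To get a feel for how much access order matters, here is a sketch that times the same all-of check twice: once scanning the array sequentially, and once in a shuffled order where almost every access is a likely cache miss. The array size and shuffling scheme are illustrative assumptions, not from the question:

```c
#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

enum { N = 10 * 1000 * 1000 };

static double elapsed_ms(struct timespec a, struct timespec b) {
    return (b.tv_sec - a.tv_sec) * 1e3 + (b.tv_nsec - a.tv_nsec) / 1e6;
}

int main(void) {
    bool   *params = malloc(N * sizeof *params);
    size_t *order  = malloc(N * sizeof *order);
    if (!params || !order) return 1;

    for (size_t i = 0; i < N; i++) { params[i] = true; order[i] = i; }

    /* Fisher-Yates shuffle for a random visiting order.
     * rand() is adequate for a sketch; prefer a better PRNG
     * if RAND_MAX is small on your platform. */
    srand(42);
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % (i + 1);
        size_t tmp = order[i]; order[i] = order[j]; order[j] = tmp;
    }

    struct timespec t0, t1;
    bool ok;

    /* Sequential access: cache- and prefetch-friendly. */
    clock_gettime(CLOCK_MONOTONIC, &t0);
    ok = true;
    for (size_t i = 0; i < N && ok; i++) ok = params[i];
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("sequential: ok=%d, %.3f ms\n", ok, elapsed_ms(t0, t1));

    /* Same checks in random order: mostly cache misses. */
    clock_gettime(CLOCK_MONOTONIC, &t0);
    ok = true;
    for (size_t i = 0; i < N && ok; i++) ok = params[order[i]];
    clock_gettime(CLOCK_MONOTONIC, &t1);
    printf("random:     ok=%d, %.3f ms\n", ok, elapsed_ms(t0, t1));

    free(order);
    free(params);
    return 0;
}
```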

Since you are reading your parameters from a file, it is important to use the same I/O API in the test program as you plan to use in the actual program. Depending on the platform, I/O APIs can differ significantly in performance, so your test code should exercise exactly what you plan to use in production.
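For example, if the production code reads one byte per parameter with stdio, the benchmark should do the same, and the timed region should include the I/O. A sketch under those assumptions (the file name `params.bin` and its one-byte-per-parameter format are hypothetical):

```c
#include <stdbool.h>
#include <stdio.h>
#include <time.h>

/* Hypothetical format: one byte per parameter in "params.bin",
 * nonzero meaning true. Replace the file name and format with
 * whatever the production code actually uses. */
int main(void) {
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    FILE *f = fopen("params.bin", "rb");
    if (!f) { perror("fopen"); return 1; }

    bool ok = true;
    unsigned char buf[1 << 16];         /* 64 KiB read buffer */
    size_t got;
    while (ok && (got = fread(buf, 1, sizeof buf, f)) > 0) {
        for (size_t i = 0; i < got; i++) {
            if (!buf[i]) { ok = false; break; }     /* short-circuit */
        }
    }
    fclose(f);

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double ms = (t1.tv_sec - t0.tv_sec) * 1e3
              + (t1.tv_nsec - t0.tv_nsec) / 1e6;
    printf("ok=%d, %.3f ms (including I/O)\n", ok, ms);
    return 0;
}
```

Run it at least twice: on most operating systems the second run will hit the OS page cache, which can dwarf every other effect discussed above.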

Sergey Kalinichenko