This question has been asked several times before in different variations, but I still don't quite get it.
I need to find a way to trigger a while loop exactly (or very, very closely) when a millisecond changes.
The idea is that a first while loop waits for that moment - the boundary between milliseconds - and once the millisecond changes (or very close to that change), the real work starts (at the moment, counting how many iterations the loop can do in 1 ms).
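In other words, the first loop should spin until the clock's millisecond value ticks over, roughly like this sketch (current_ms is just an illustrative name):

long current_ms = DateTime.Now.Ticks / 10000; // 10,000 ticks = 1 ms
while (DateTime.Now.Ticks / 10000 == current_ms) { } // spin until the millisecond changes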
My question is: is my time calculation about the millisecond change and the loop timings correct?
If so, why is the output not consistent, or at least made up of close numbers? In theory the loop should run for approximately the same amount of time on every pass, so approximately the same number of iterations should occur. Could the reason for such numbers be that the process loses the CPU at runtime?
I don't mind doing this in C or C++ if needed.
This is my current code (using C#):
const int iterations = 50;
const long ticks_per_ms = 10000; // 1 tick = 100 ns, so 10,000 ticks per ms
int min = 0, max = 0, sum = 0;

for (int i = 0; i < iterations; i++)
{
    int increments_per_ms = 0;
    var start = DateTime.Now.Ticks; // current time in ticks since Jan 1st, 0001 00:00
    while ((DateTime.Now.Ticks - start) <= ticks_per_ms) { } // wait for 1 ms to pass
    start += ticks_per_ms; // advance start by 1 ms
    while ((DateTime.Now.Ticks - start) <= ticks_per_ms) // count for the next 1 ms
    {
        increments_per_ms++;
    }
    Console.WriteLine(increments_per_ms.ToString());
    if (min == 0)
        min = increments_per_ms;
    if (min > increments_per_ms)
        min = increments_per_ms;
    if (max < increments_per_ms)
        max = increments_per_ms;
    sum += increments_per_ms;
}

int avg = sum / iterations;
Console.WriteLine("minimum is: " + min.ToString());
Console.WriteLine("maximum is: " + max.ToString());
Console.WriteLine("average is: " + avg.ToString() + "\n\n\n");
Output for 50 iterations:
0
6582
6601
6509
5248
6423
6710
4901
6499
6187
6426
6573
6545
6450
6567
4786
6582
6919
7018
6393
5990
6432
6084
5589
5396
6357
6578
6577
6557
7182
5137
6472
6543
6321
6533
6956
6811
2846
6269
5739
6307
5740
3673
5609
5440
5857
6561
4379
6026
6162
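If it matters, I'm also open to using System.Diagnostics.Stopwatch instead of DateTime.Now, in case clock resolution is part of the problem. Here is a rough sketch of the same measurement for a single millisecond (note that Stopwatch ticks are in units of 1/Stopwatch.Frequency seconds, not the 100 ns units DateTime uses):

using System;
using System.Diagnostics;

class StopwatchVariant
{
    static void Main()
    {
        long ticks_per_ms = Stopwatch.Frequency / 1000; // Stopwatch ticks in one ms

        var sw = Stopwatch.StartNew();

        // Spin until the next whole-millisecond boundary of the stopwatch.
        long boundary = (sw.ElapsedTicks / ticks_per_ms + 1) * ticks_per_ms;
        while (sw.ElapsedTicks < boundary) { }

        // Count iterations during the following millisecond.
        long end = boundary + ticks_per_ms;
        int increments_per_ms = 0;
        while (sw.ElapsedTicks < end)
        {
            increments_per_ms++;
        }

        Console.WriteLine(increments_per_ms);
    }
}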