This is my first attempt at using Stopwatch to measure code performance, and I don't know what is wrong. I want to check whether it makes any difference how you cast to double when calculating the average of two integers.
// Promotes to double by adding 0.0 before dividing.
public static double Avarage(int a, int b)
{
    return (a + b + 0.0) / 2;
}

// Casts the integer sum to double before dividing.
public static double AvarageDouble(int s, int d)
{
    return (double)(s + d) / 2;
}

// Casts only the first operand, so the addition is already done in double.
public static double AvarageDouble2(int x, int v)
{
    return ((double)x + v) / 2;
}
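For reference, all three variants compute the same value for the same inputs; a quick sanity check (just a sketch that assumes the three methods above are in scope, e.g. defined in the same class):

// All three casting styles yield the same double result.
Console.WriteLine(Avarage(2, 3));        // prints 2.5
Console.WriteLine(AvarageDouble(2, 3));  // prints 2.5
Console.WriteLine(AvarageDouble2(2, 3)); // prints 2.5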
The code to test these three methods, using Stopwatch:
Stopwatch sw = new Stopwatch();

sw.Start();
for (int i = 0; i < 1000000; i++)
{
    var ret = Avarage(2, 3);
}
sw.Stop();
Console.Write("Using 0.0: " + sw.ElapsedTicks + "\n");

sw.Reset();
sw.Start();
for (int i = 0; i < 1000000; i++)
{
    var ret2 = AvarageDouble(2, 3);
}
sw.Stop();
Console.Write("Using Double(s+d): " + sw.ElapsedTicks + "\n");

sw.Reset();
sw.Start();
for (int i = 0; i < 1000000; i++)
{
    var ret3 = AvarageDouble2(2, 3);
}
sw.Stop();
Console.Write("Using double (x): " + sw.ElapsedTicks + "\n");
The results look random: sometimes Avarage is the fastest, other times AvarageDouble or AvarageDouble2. I used different parameter names in each method, but that does not seem to matter.
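To show how much the numbers jump around, here is a rough sketch of how I could repeat one of the measurements several times within the same program (it assumes using System.Diagnostics; and that the methods above are in scope; runCount is just an arbitrary name):

// Repeat the same measurement several times to see the run-to-run variation in ticks.
const int runCount = 5;
var sw2 = new Stopwatch();
for (int run = 0; run < runCount; run++)
{
    sw2.Restart();
    for (int i = 0; i < 1000000; i++)
    {
        var ret = Avarage(2, 3);
    }
    sw2.Stop();
    Console.Write("Run " + run + ": " + sw2.ElapsedTicks + "\n");
}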
What am I missing?
PS: What is the best way to calculate the average of two ints?