I'm trying to sum 5 numbers in two ways, but the results are different. How does that happen, and what should I do about it? I need to use the double type.
int startIndex = i - (Lenght - 1);
Debug.WriteLine("By LINQ:");
Debug.WriteLine(company_data.Close[startIndex]);
for (int j = 1; j < Lenght; j++)
{
    Debug.WriteLine(company_data.Close[startIndex + j]);
}
double sum = deviations.GetRange(startIndex, Lenght).Sum();
Debug.WriteLine("\nSUM = " + sum);
In the output I get:
By LINQ:
25.51
25.585
25.68
25.5975
25.56
SUM = 127.9325
And I have another way to calculate the same sum:
double sum = 0.0;
Debug.WriteLine("By Method:");
for (int i = index; i > index - lenght; i--)
{
    Debug.WriteLine(source[i]);
    Debug.WriteLine("sum before = " + sum);
    sum += deviations[i];
    Debug.WriteLine("sum after = " + sum);
}
Debug.WriteLine("sum = " + sum);
And this is the output I get now:
By Method:
25.56
sum before = 0
sum after = 25.56
25.5975
sum before = 25.56
sum after = 51.1575
25.68
sum before = 51.1575
sum after = 76.8375
25.585
sum before = 76.8375
sum after = 102.42250000000001
25.51
sum before = 102.42250000000001
sum after = 127.93250000000002
sum = 127.93250000000002
Where did this extra '0.00000000000001' value come from?
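
As far as I can tell the difference comes down to just these five values and the order in which they are added, so here is a minimal standalone sketch of what I mean (the class name and the use of Console instead of Debug are only for the example; the five values are hard-coded from the output above, so company_data, deviations and source are not needed):

using System;
using System.Linq;

class SumOrderRepro
{
    static void Main()
    {
        // The same five values as in the output above, hard-coded
        double[] values = { 25.51, 25.585, 25.68, 25.5975, 25.56 };

        // Forward order, like deviations.GetRange(startIndex, Lenght).Sum()
        double forward = values.Sum();

        // Reverse order, like the for loop that walks the index downwards
        double backward = 0.0;
        for (int i = values.Length - 1; i >= 0; i--)
        {
            backward += values[i];
        }

        Console.WriteLine("forward  = " + forward);   // prints 127.9325 for me
        Console.WriteLine("backward = " + backward);  // prints 127.93250000000002 for me
    }
}

The only difference between the two sums is the order in which the additions are done.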