Below is my code. I am trying to increment a value by 0.1, and after five 0.1 steps jump to the next whole number (e.g. 0.5 → 1.0), then keep going. But when I run the code, the printed values have extra decimal digits beyond what I added.
double value1 = 1.0;
double minorvalue = 0.1;
int count = 0;
double output = 0;
for (int i = 0; i < 1000; i++)
{
    double _over;
    if (count < 5)
    {
        // add 0.1 five times: 0.1, 0.2, ..., 0.5
        _over = output + minorvalue;
        output = _over;
        Console.WriteLine(output);
        count++;
    }
    else
    {
        // after the fifth step, round up to the next whole number
        _over = (output + value1) - 0.5;
        output = _over;
        Console.WriteLine(output);
        minorvalue = 0.1;
        count = 0;
    }
}
Console.ReadLine();
The output looks like this:
0.1
0.2
0.30000000000000004
0.4
0.5
1
1.1
1.2000000000000002
1.3000000000000003
1.4000000000000004
1.5000000000000004
2.0000000000000004
2.1000000000000005
2.2000000000000006
2.3000000000000007
2.400000000000001
2.500000000000001
3.000000000000001
3.100000000000001
3.200000000000001
3.300000000000001
3.4000000000000012
3.5000000000000013
4.000000000000002
4.100000000000001
4.200000000000001
4.300000000000001
4.4
4.5
5
5.1
5.199999999999999
5.299999999999999
5.399999999999999
5.499999999999998
5.999999999999998
6.099999999999998
6.1999999999999975
6.299999999999997
6.399999999999997
6.4999999999999964
I don't know what's wrong with this increment: it gives values like 2.3000000000000003 in some places and 10.299999999999999 in others, instead of 2.3 and 10.3.
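For comparison, here is a sketch of the same loop I tried using C#'s decimal type instead of double. decimal stores digits in base 10, so 0.1m is represented exactly and the increments print cleanly; this is a rework of the loop above (the else branch adds 0.5m directly, which is what (output + 1.0) - 0.5 amounts to), not a claim that it is the only fix:

    decimal minorvalue = 0.1m;
    int count = 0;
    decimal output = 0m;
    for (int i = 0; i < 20; i++)
    {
        if (count < 5)
        {
            // exact base-10 addition: 0.1, 0.2, ..., 0.5
            output += minorvalue;
            count++;
        }
        else
        {
            // 0.5 + 0.5 = next whole number, then restart the 0.1 steps
            output += 0.5m;
            count = 0;
        }
        Console.WriteLine(output);
    }

With double, 0.1 has no exact binary (base-2) representation, so each addition carries a tiny rounding error that accumulates over the loop and shows up in the printed digits.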