I'm porting a MATLAB project to C#. This snippet of MATLAB code fills griglia with decimals like 0.025, 0.075, 0.325:
for xgriglia=x1+incxy/2:incxy:x2
    for ygriglia=y1+incxy/2:incxy:y2
        contagriglia=contagriglia+1;
        griglia(contagriglia,1)=xgriglia;
        griglia(contagriglia,2)=ygriglia;
    end
end
I translated it to C# like so (the variables are all double):
for (var xgriglia = x1 + incxy / 2; xgriglia <= x2; xgriglia += incxy)
{
    for (var ygriglia = y1 + incxy / 2; ygriglia <= y2; ygriglia += incxy)
    {
        contagriglia++;
        griglia0[contagriglia - 1] = xgriglia;
        griglia1[contagriglia - 1] = ygriglia;
    }
}
but the results in griglia0 and griglia1 come out like 0.075000000000000011, 0.32499999999999996, 0.37499999999999994. What is the cause of this, and how can I fix it?
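For reference, here is a self-contained repro of the drift next to an index-based variant I'm considering. The values x1 = 0, x2 = 1, incxy = 0.05 are hypothetical, picked only because they reproduce the effect (my real values come from the project), and GridRepro is just a scratch name:

using System;

class GridRepro
{
    static void Main()
    {
        // Hypothetical inputs, for demonstration only; the real ones come from the project.
        double x1 = 0.0, x2 = 1.0, incxy = 0.05;

        // Accumulating version, same stepping as my port above: the rounding error
        // of every += incxy is carried forward into all later values.
        Console.WriteLine("accumulated:");
        for (var x = x1 + incxy / 2; x <= x2; x += incxy)
            Console.WriteLine(x.ToString("R")); // "R" = round-trip format, exposes the drift

        // Index-based version: each value is computed directly from an integer counter,
        // so the error stays at a single rounding step and does not grow with the loop.
        Console.WriteLine("index-based:");
        double start = x1 + incxy / 2;
        int n = (int)Math.Floor((x2 - start) / incxy + 1e-9) + 1; // 1e-9 tolerance at the boundary
        for (int i = 0; i < n; i++)
            Console.WriteLine((start + i * incxy).ToString("R"));
    }
}

Even the index-based values can't be exactly 0.075, since 0.075 has no exact binary representation in a double; if I only need the stored or printed values to look like clean decimals, is rounding (e.g. Math.Round(value, 3)) the right way to go, or is there a better approach?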