I was writing this code to get random float numbers with at most one decimal place when I realized there were some weird numbers in the console.
public void PrintNumbers()
{
    for (int i = 0; i < 30; i++)
    {
        int num = Random.Range(1, 200);
        float x = num * 0.1f;
        print(x);
    }
}
Then I realized that the weird number was always 8.900001, so I wrote this piece of code:
public void PrintWeirdNumber()
{
    float x = 89 * 0.1f;
    print(x);
}
This method always prints 8.900001. I already found an easy solution, which is just to divide the integer by 10.0f instead of multiplying by 0.1f.
public void PrintNumber()
{
    float x = 89 / 10.0f;
    print(x);
}
This will always print 8.9.
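Printing the values with nine significant digits (using the standard "G9" numeric format string; the method name here is just for illustration) shows what is actually stored:
public void PrintMoreDigits()
{
    // "G9" formats a float with nine significant digits instead of
    // the default short rounding, revealing the stored values.
    print((0.1f).ToString("G9"));       // 0.100000001 -> 0.1f is not exactly 0.1
    print((89 * 0.1f).ToString("G9"));  // 8.90000057  -> the product lands above 8.9
    print((89 / 10.0f).ToString("G9")); // 8.89999962  -> the quotient lands below 8.9
}
So the division result isn't exactly 8.9 either; the default formatting just happens to round it back to 8.9.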
Why is this happening? I just don't understand float numbers well enough to know the reason behind this. Any help?