I am working in Unity3D with C# and I get a weird result. Can anyone tell me why this code sets A to 0?

float A = 1 / 90;
The literals 1 and 90 are interpreted as an int, so integer division is used. After that, the result is converted to a float.

In general, C# reads any sequence of digits without a decimal dot as an int. An int will be converted to a float if necessary, but before the assignment that conversion is not necessary, so all calculations in between are done as ints.
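To see the effect, here is a minimal sketch (plain C#, runnable outside Unity; the class name is illustrative) showing intermediate int arithmetic truncating before the float assignment:

using System;

class IntDivisionDemo
{
    static void Main()
    {
        float a = 1 / 90;        // int division yields 0, which is then converted to 0f
        float b = (1 / 2) * 90;  // 1 / 2 truncates to 0 as an int, so b is 0f, not 45f
        Console.WriteLine(a);    // prints 0
        Console.WriteLine(b);    // prints 0
    }
}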
In other words, what you've written is:

float A = (float)(((int) 1) / ((int) 90));

(made explicit here; this is more or less what the compiler reads).
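Note that the placement of the cast matters: a cast binds tighter than the division operator, so casting one operand switches to floating point division, while casting the whole expression does not. A small sketch:

float x = (float)1 / 90;    // cast applies to 1 first: float division, x ≈ 0.0111111
float y = (float)(1 / 90);  // int division happens first, then 0 is cast: y == 0f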
Now a division of two ints keeps only the integral part of the quotient. The integral part of 0.0111... is 0, thus the result is zero.
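Integer division in C# truncates toward zero, so the fractional part is simply discarded, for positive and negative values alike. For example:

int p = 7 / 2;     // 3, not 3.5 (fraction discarded)
int q = -7 / 2;    // -3, truncation is toward zero, not flooring to -4
int r = 1 / 90;    // 0, since 0.0111... truncates to 0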
If you however modify one of the literals (or both) to a floating point literal (1f, 1.0f, 90f, ...), this will work. Thus use one of these:
float A = 1/90.0f;
float A = 1.0f/90;
float A = 1.0f/90.0f;
In that case, floating point division is performed, which takes both the integral and fractional parts into account.
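The same applies when the operands are variables rather than literals; since you can't attach an f suffix to a variable, cast one operand instead. A short sketch with illustrative names:

int numerator = 1;
int denominator = 90;
float A = (float)numerator / denominator;  // casting one operand forces float division, A ≈ 0.0111111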