I'm facing a huge problem, and I have no idea what's causing it or how to fix it. I'm writing a game in Java where I draw objects to the screen. For drawing I use linear interpolation, which blends an object's previous and current positions using an 'alpha' value that represents how far into the current frame we are.
Here is what the interpolation looks like:
draw.x = current.getX() * alpha + previous.getX() * (1.0f - alpha);
All of the values involved (alpha, current, previous, and draw) are floats.
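In context, it boils down to a helper like this (a minimal sketch; the class and field names in my real code don't matter):

    final class Interpolation {
        /** Blends the previous and current positions; alpha is in the 0.0 - 1.0 range. */
        static float interpolate(float previous, float current, float alpha) {
            // alpha == 0.0f gives the previous position, alpha == 1.0f the current one
            return current * alpha + previous * (1.0f - alpha);
        }
    }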
Let's assume that we're not moving, so current.getX() and previous.getX() are both 100.0f. The alpha value is effectively arbitrary (it's not actually random, but it can be any float in the 0.0 - 1.0 range).
Let's take alpha = 0.4352343:

draw.x = 100.0f * 0.4352343 + 100.0f * (1.0f - 0.4352343) == 43.52343 + 56.47657 == 100
which is perfect, because we're not moving - it should be 100 (our previous and current positions are both 100).
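And indeed, writing it out in Java (a quick sketch):

    float alpha = 0.4352343f;
    float drawX = 100.0f * alpha + 100.0f * (1.0f - alpha);
    System.out.println(drawX); // algebraically this should be exactly 100.0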
That's just how the math works: 100 * alpha + 100 * (1 - alpha) = 100 * (alpha + 1 - alpha) = 100, so no matter what value alpha takes, the result should always be 100, right?
NO!
In C++ it does - it works perfectly. I tested it for 30 minutes, printing the output of 10 million cases, and it always returns 100.0.
But in Java it doesn't. Out of 80,000 cases there are ~400-500 where draw.x (the interpolated value) comes out as 99.99999999999.
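Here is a minimal loop that reproduces what I'm seeing (a sketch - in the real game the alpha values come from the frame timer rather than a Random, and the exact mismatch count varies from run to run):

    import java.util.Random;

    public class InterpolationTest {
        public static void main(String[] args) {
            Random rng = new Random();
            float previous = 100.0f;
            float current = 100.0f;
            int mismatches = 0;
            int runs = 80_000;
            for (int i = 0; i < runs; i++) {
                float alpha = rng.nextFloat(); // some value in [0.0, 1.0)
                float drawX = current * alpha + previous * (1.0f - alpha);
                if (drawX != 100.0f) {
                    mismatches++; // the result is not exactly 100
                }
            }
            System.out.println(mismatches + " of " + runs
                    + " results were not exactly 100.0");
        }
    }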
This is completely unacceptable, because when I draw I need an int, so I have to cast - the cast gives 99, and you can guess what that does to the animation.
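For example, here is what the truncating cast does (a tiny illustration):

    float drawX = 99.99999f;         // what the interpolation sometimes produces
    int pixelX = (int) drawX;        // truncates toward zero -> 99, one pixel off
    int rounded = Math.round(drawX); // 100 - rounding would hide the symptom,
                                     // but I'd rather understand the cause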
I have just one question: how is it possible that the same algorithm gives different values in Java and C++, and how can I fix this in Java?