I'm porting a program originally written in VB.NET to Java. I'm reading from a file that stores 32-bit floats in little-endian order.
The original program does this:
Dim br As BinaryReader = ...
Dim f_vb As Single = br.ReadSingle
Java's RandomAccessFile reads big-endian, so I reverse the bytes before converting to a float.
RandomAccessFile raf = // ...
int i = raf.readInt();                           // readInt() assembles the 4 bytes big-endian
int bigEndian = Integer.reverseBytes(i);         // swap to recover the bit pattern the file intended
float f_java = Float.intBitsToFloat(bigEndian);
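If it matters, the byte swap should be equivalent to reading the same four bytes through a little-endian java.nio.ByteBuffer; a minimal sketch (reusing raf from above, not code from the actual port):
byte[] buf = new byte[4];
raf.readFully(buf);                              // the same four bytes readInt() would consume
float viaBuffer = java.nio.ByteBuffer.wrap(buf)
        .order(java.nio.ByteOrder.LITTLE_ENDIAN) // interpret them as little-endian
        .getFloat();                             // should produce the same bits as f_java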
As far as I can tell, f_vb and f_java contain the same bits. That is, BitConverter.ToInt32 on the bytes of f_vb and Float.floatToIntBits (and floatToRawIntBits) on f_java give the same value. However, the floats are not equal. For example, let bigEndian == 0x4969F52F: Java will report 958290.94, and VB.NET will report 958290.938.
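To pin down exactly what I'm comparing on the Java side, here is a small standalone check of the example value (class name and expected outputs in the comments are just for illustration):
import java.math.BigDecimal;

public class FloatBitsCheck {
    public static void main(String[] args) {
        float f = Float.intBitsToFloat(0x4969F52F);

        // Round-trip confirms the bit pattern is untouched by the conversion
        System.out.println(Integer.toHexString(Float.floatToRawIntBits(f))); // 4969f52f

        // Java's default rendering of the float
        System.out.println(f);                                               // 958290.94

        // Exact decimal value of that IEEE 754 single
        System.out.println(new BigDecimal(f));                               // 958290.9375
    }
}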
I'm guessing this comes from a difference in the way the JVM and the CLR handle floating-point numbers, but I don't know enough about floating-point issues to figure out why. This loss of precision is causing trouble down the line, so I'd like to identify the source.