I'm working on a bit of code where I'm implementing half-even rounding. I've been using this post (How to round a number to n decimal places in Java) as a guide and am running tests using the code below on an online Java compiler:
// Java code
import java.math.RoundingMode;
import java.text.DecimalFormat;

public class Main {
    public static void main(String[] args) {
        String precisionString = "#.##";
        double localValue = 265.335;
        DecimalFormat df = new DecimalFormat(precisionString);
        df.setRoundingMode(RoundingMode.HALF_EVEN);
        System.out.println(Double.parseDouble(df.format(localValue)));
    }
}
It works fine for most test values, but for the value 265.335, rounding to 2 decimal places returns 265.33 when it should be 265.34.
I tested this out on an online C# compiler and got the same result (C# equivalent below), so I don't think this is a language bug; I suspect it has to do with how the number is represented in binary. I'm not able to figure out exactly what/where the problem is, though.
// C# code
using System;

public class Program
{
    public static void Main()
    {
        double a = 265.335;
        var rounded = Math.Round(a, 2, MidpointRounding.ToEven);
        Console.WriteLine(rounded);
    }
}
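For what it's worth, one way to check the binary-representation theory is to print the exact value the double literal actually stores, using `BigDecimal`'s double constructor (which preserves the binary value instead of going through the decimal string). A minimal sketch — the class name here is just for illustration:

```java
import java.math.BigDecimal;

public class ExactValueDemo {
    public static void main(String[] args) {
        // new BigDecimal(double) converts the exact binary value of the
        // double, unlike BigDecimal.valueOf(double) or Double.toString(),
        // which use the shortest decimal representation that round-trips.
        System.out.println(new BigDecimal(265.335));
        // The output starts 265.3349... - the stored value is slightly
        // below 265.335, so there is no true midpoint for HALF_EVEN to
        // break; it just rounds the actual value down to 265.33.
    }
}
```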
Thoughts?