
Very basic question: I am trying to initialize a BigDecimal with a value of 1.1, but when I print it the output is not exactly 1.1.

BigDecimal bd1 = new BigDecimal(1.1);
System.out.println(bd1);

The output is

1.100000000000000088817841970012523233890533447265625

It works if I use "1.1" (as a String), or if I print bd1.doubleValue().

Can someone please explain this behavior, and where those additional decimal digits come from?
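For reference, a minimal snippet reproducing all three observations (the class name is just for illustration):

```java
import java.math.BigDecimal;

public class BigDecimalQuestion {
    public static void main(String[] args) {
        BigDecimal bd1 = new BigDecimal(1.1);
        // Prints the long tail of digits:
        System.out.println(bd1);
        // Both of these print exactly 1.1:
        System.out.println(bd1.doubleValue());
        System.out.println(new BigDecimal("1.1"));
    }
}
```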

Adarsh
    Very important: always check the Javadoc first. The Javadoc for `BigDecimal(double)` answers your question exactly. https://docs.oracle.com/javase/8/docs/api/java/math/BigDecimal.html#BigDecimal-double- – Erwin Bolwidt Jul 13 '19 at 03:32

1 Answer


It's because the number you're initializing it with - 1.1 - is a double literal! And as a binary floating-point value it can't represent 1.1 exactly: the fractional part 0.1 has an infinitely repeating expansion in binary, so the constructor receives the nearest representable double instead. Look up how floating-point numbers are represented and you'll find this is the canonical example.

Initialize it some other way, e.g., with the String "1.1", or with 11 divided by 10 - you'll get what you want.
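A sketch of the exact alternatives (class name is illustrative):

```java
import java.math.BigDecimal;

public class ExactInit {
    public static void main(String[] args) {
        // Exact: the String constructor parses the decimal digits directly.
        System.out.println(new BigDecimal("1.1"));

        // Exact: valueOf goes through Double.toString, which yields "1.1".
        System.out.println(BigDecimal.valueOf(1.1));

        // Exact: 11 / 10 terminates in decimal, so divide succeeds exactly.
        System.out.println(new BigDecimal(11).divide(new BigDecimal(10)));
    }
}
```

All three print 1.1; only the `double` constructor exposes the binary approximation.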

Here is one Stack Overflow answer among many that explains this property of floating-point numbers.

davidbak