I know that I can use the BigDecimal.divide() method to set a fixed scale (number of decimal places):
BigDecimal a1 = new BigDecimal(1);
BigDecimal a2 = new BigDecimal(3);
BigDecimal division = a1.divide(a2,5,1); //equals 0.33333
But if the division result is exact:
BigDecimal a1 = new BigDecimal(1);
BigDecimal a2 = new BigDecimal(4);
BigDecimal division = a1.divide(a2,5,1); //equals 0.25000
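For reference, here are both of those cases written out as a small self-contained example (the class name is just for illustration; as far as I know, RoundingMode.DOWN is the named equivalent of the rounding-mode constant 1):
import java.math.BigDecimal;
import java.math.RoundingMode;

public class DivisionScaleDemo {
    public static void main(String[] args) {
        // Inexact division with a fixed scale of 5
        BigDecimal a1 = new BigDecimal(1);
        BigDecimal a2 = new BigDecimal(3);
        System.out.println(a1.divide(a2, 5, RoundingMode.DOWN)); // prints 0.33333

        // Exact division: the fixed scale still pads the result with zeros
        BigDecimal b1 = new BigDecimal(1);
        BigDecimal b2 = new BigDecimal(4);
        System.out.println(b1.divide(b2, 5, RoundingMode.DOWN)); // prints 0.25000
    }
}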
How can I set up the division so that, when the result is exact, the output is just 0.25 instead of 0.25000?
I've also tried not specifying a scale at all:
BigDecimal division = a1.divide(a2);
It succeeds in giving the result 0.25 when doing 1/4, but for a division like 1/3 or 2/3 it results in a runtime ArithmeticException.
a1 and a2 are instantiated from user input, so I can't know in advance whether the division will be exact.
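For example, this is the kind of minimal case where the call without a scale blows up (again, the class name is only for the example):
import java.math.BigDecimal;

public class ExactDivisionDemo {
    public static void main(String[] args) {
        BigDecimal a1 = new BigDecimal(1);
        BigDecimal a2 = new BigDecimal(3);
        // 1/3 has no terminating decimal expansion, so divide() with no
        // scale throws java.lang.ArithmeticException here
        BigDecimal division = a1.divide(a2);
        System.out.println(division);
    }
}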
Is there any way to solve this?