
First off, I'm a bit new to Python. I know this floating-point arithmetic question seems very basic, but I couldn't find any duplicate/related question on SO.

I have an acceptance test: expect 3.3 / 3 to be 1.1

Then I tried:

>>> from decimal import *
>>> Decimal(3.3) / Decimal(3)
Decimal('1.099999999999999940788105353')

>>> Decimal(3.3) / Decimal(3.0)
Decimal('1.099999999999999940788105353')

>>> Decimal('3.3') / Decimal('3')
Decimal('1.1')  # as expected

Question: What is the best practice for using Python's decimal in a predictable way? Or do I just need to format every decimal as a string for display?

To be more specific: I'm writing a small automation script for a loan data report.

aifarfa
  • Possible duplicate of [Limiting floats to two decimal points](http://stackoverflow.com/questions/455612/limiting-floats-to-two-decimal-points) – Peter Wood Nov 23 '15 at 10:25
  • Shouldn't you use integers for money? Or maybe [**`fraction`**](https://docs.python.org/2/library/fractions.html). – Peter Wood Nov 23 '15 at 10:25
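
Regarding the `fractions` suggestion in the comment above, a minimal sketch (assuming exact rational arithmetic is acceptable for the report):

from fractions import Fraction

# Fraction keeps exact rational values, so 3.3 / 3 really is 1.1.
# As with Decimal, construct from strings or integer pairs, not bare floats.
result = Fraction('3.3') / 3
print(result)         # 11/10
print(float(result))  # 1.1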

2 Answers


The point is that by passing the bare float 3.3 to Decimal, you're already subject to floating-point imprecision:

>>> Decimal(3.3)
Decimal('3.29999999999999982236431605997495353221893310546875')

So, yes, you should always pass strings.
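
Side by side, a minimal sketch of the difference:

from decimal import Decimal

# From a float, the binary imprecision is baked into the Decimal:
print(Decimal(3.3) / Decimal(3))      # 1.099999999999999940788105353

# From strings, the values are exact decimals:
print(Decimal('3.3') / Decimal('3'))  # 1.1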

Daniel Roseman

Looking at https://docs.python.org/2/library/decimal.html, it is possible to set the precision for your operations. The default is 28 significant digits.

from decimal import *
getcontext().prec = 2
Decimal(3.3) / Decimal(3)

This returns `Decimal('1.1')`.
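
One caveat: prec counts significant digits, not digits after the decimal point, and changes to getcontext() affect every subsequent operation in the thread. A minimal sketch using decimal.localcontext() to scope the setting instead:

from decimal import Decimal, localcontext

with localcontext() as ctx:
    ctx.prec = 2
    # Inside the block, results are rounded to 2 significant digits.
    print(Decimal(3.3) / Decimal(3))  # 1.1

# Outside the block, the default 28-digit context applies again.
print(Decimal(3.3) / Decimal(3))      # 1.099999999999999940788105353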

Mirec Miskuf
    To get a predictable precision on input, use strings. To get predictable precision on output, use context's prec value. – Mirec Miskuf Nov 23 '15 at 10:30
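
Combining both halves of that advice for the loan-report use case, a minimal sketch with made-up amounts: strings on input, quantize() on output (quantize fixes the number of decimal places rather than significant digits, which is usually what you want for money):

from decimal import Decimal, ROUND_HALF_UP

principal = Decimal('3.3')  # hypothetical loan figure
periods = Decimal('3')

payment = principal / periods
# Round only for output; quantize pins the exponent to two decimal places.
print(payment.quantize(Decimal('0.01'), rounding=ROUND_HALF_UP))  # 1.10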