The code
I have been testing out Decimal in Python 3, and I have come across some strange behaviour that does not make sense to me.
First of all, I imported Decimal:

    from decimal import *
Next, I set the accuracy (in digits) that I want for any calculations:

    getcontext().prec = 50
Then, I defined a variable called `num`, which I expected to equal 0.6 recurring:

    num = Decimal(2/3)
However, when I try to print `num`:

    print(num)

I get this:

    0.66666666666666662965923251249478198587894439697265625
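For what it's worth, the same stray digits show up if I inspect the float `2/3` directly, so I am guessing that `Decimal` is receiving an already-evaluated float rather than the fraction itself:

```python
from decimal import Decimal

# 2/3 is evaluated as a binary float before Decimal ever sees it
print(repr(2/3))     # 0.6666666666666666
print(Decimal(2/3))  # the same stray digits as in the output above
```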
Also, changing the precision to either of these:

    getcontext().prec = 500
    getcontext().prec = 3

changes nothing; even with 3 it gives the same output.
My two main questions
So there are two things that I don't understand here:

- Why the seemingly random and incorrect digits after the first `0.6666666666666666`? I was expecting it to print `0.6` with as many `6`'s as the number set in `getcontext().prec`.
- Isn't `getcontext().prec = 3` supposed to make the result 3 digits long? It still prints far more digits than that, and `getcontext().prec = 500` doesn't make the output anywhere near 500 digits long either.
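One experiment I tried, working from the guess that `prec` only affects the results of arithmetic operations rather than the construction of a `Decimal`, does respect the precision setting:

```python
from decimal import Decimal, getcontext

# Guess: prec applies to arithmetic results, not to Decimal() construction
getcontext().prec = 3
print(Decimal(2) / Decimal(3))  # 0.667

getcontext().prec = 50
print(Decimal(2) / Decimal(3))  # 50 significant digits of 0.666...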
Edit: I am using Python 3 on Windows.