I have a list of floats (actually a pandas Series object, in case that matters) which looks like this:
mySeries:
...
22 16.0
23 14.0
24 12.0
25 10.0
26 3.1
...
(So the elements of this Series are on the right, the indices on the left.) I'm trying to build a dictionary with the elements of this Series as keys and the indices as values, like this:
{ mySeries[i]: i for i in mySeries.index }
and I'm getting pretty much what I wanted, except that...
{ 6400.0: 0, 66.0: 13, 3.1000000000000001: 23, 133.0: 10, ... }
Why has 3.1 suddenly changed into 3.1000000000000001? I guess this has something to do with how floating-point numbers are represented, but why does it happen here, and how do I avoid/fix it?
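For reference, here is a small snippet (no pandas needed) showing that the two printed forms come from the same underlying double; 3.1 simply can't be stored exactly in binary floating point, and printing it with 17 significant digits exposes that:

```python
x = 3.1

# 17 significant digits is enough to show the true stored value
print(format(x, ".17g"))          # 3.1000000000000001

# both literals round to the exact same double, so they compare equal
print(x == 3.1000000000000001)    # True
```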
EDIT: Please feel free to suggest a better title for this question if it's inaccurate.
EDIT2: Ok, so it seems that it's the exact same number, just printed differently. Still, if I assign mySeries[26] as a dictionary key and then try to run myDict[mySeries[26]], I get a KeyError. What's the best way to avoid it?
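One workaround I've been considering (not sure it's the right approach) is rounding the floats before using them as keys, so that tiny representation differences collapse to the same key. A toy sketch with a made-up Series (the data below is invented for illustration):

```python
import pandas as pd

# toy stand-in for my real Series
mySeries = pd.Series([16.0, 14.0, 12.0, 10.0, 3.1], index=[22, 23, 24, 25, 26])

# round every value before using it as a key
myDict = {round(v, 6): i for i, v in mySeries.items()}

# look up with the same rounding applied
print(myDict[round(3.1, 6)])
```

Is rounding like this reasonable, or is there a cleaner way to make float keys reliable?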