I have a simple problem:

print 1 / 100

returns

0

and

print float(1 / 100)

returns

0.0

Why does this happen? Shouldn't it return 0.01? Thank you for any help.

Demandooda

3 Answers

Simple solution!

print 1 / float(100)

Your problem is that, by default, the division operator in Python 2 performs integer division (flooring the result) when both operands are integers. Making one of the operands a float makes Python divide in the expected way. You were almost there with float(1 / 100); however, all that accomplishes is performing the integer division of 1 by 100, which equals zero, and then converting zero to a floating point number.
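
For example, a quick comparison (assuming a Python 2 interpreter, matching the question's print statement syntax):

print 1 / 100           # 0    -- both operands are ints, so integer division
print float(1 / 100)    # 0.0  -- the integer division happens first, then 0 is converted
print 1 / float(100)    # 0.01 -- one float operand forces true division
print 1.0 / 100         # 0.01 -- a float literal has the same effect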

This is a recognized issue in Python 2, fixed in Python 3. If you get tired of writing x / float(y) all the time, you can do from __future__ import division to make the division operator behave as in Python 3.
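
As a minimal sketch of that import (again assuming Python 2; in Python 3 this is already the default behaviour):

from __future__ import division   # must appear at the top of the module

print 1 / 100    # 0.01 -- / now performs true division
print 1 // 100   # 0    -- // still gives floor (integer) division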

HugoMailhot

You are doing integer division. What you've written is, for all intents and purposes:

float(int(1)/int(100))

For example:

assert float(10/3) == 3.0  # passes: 10/3 is integer division and gives 3

You need to make at least one of the operands a float, either implicitly (a float literal) or explicitly (with float()). All of these are valid:

float(1.0/100.0)
float(1.0/100)
float(1/100.0)
float(float(1)/float(100))
etc...
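
As a quick check (assuming Python 2), each of the expressions above evaluates to 0.01, while the original float(1/100) does not:

print float(1.0/100.0)            # 0.01
print float(1/100.0)              # 0.01
print float(float(1)/float(100))  # 0.01
print float(1/100)                # 0.0 -- both operands are ints, so this still fails
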
Goodies

You are doing integer division when you are expecting a decimal (float). You can make an integer literal a float by placing a decimal point at the end, i.e.

print 1 / 100.
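
For instance (assuming Python 2), the trailing decimal point makes 100. a float literal, so true division is performed:

print 1 / 100.      # 0.01
print type(100.)    # <type 'float'>
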
mmoo