I wrote the following to count the number of trailing zeros in a number:
    def cal(l):
        r = 0
        o = 0
        while r == 0:        # keep going while the last digit is 0
            r = l % 10       # current last digit
            o += 1           # count this digit
            l = int(l / 10)  # drop the last digit
        return o - 1         # number of trailing zeros seen
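
For example, this is what I get (run under CPython 3; the test inputs are just ones I picked to illustrate):

    print(cal(100))      # prints 2, as expected
    print(cal(10**22))   # prints 22, as expected
    print(cal(10**24))   # prints 1, but I expected 24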
So when the input is greater than 10^23, the output is 1. Why does this happen?
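
Running the division step from the loop on its own shows the value already comes out wrong at that size (again CPython 3, where / on two ints produces a double-precision float):

    print(int(10**24 / 10))   # 99999999999999991611392, not 10**23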