print ("This program reads an unsigned binary number of arbitrary length
\nand tries to compute the decimal equivalent. The program reports an error
\nif the input number contains characters other than 0 and 1.")
dec = 0       # decimal result built up so far
bin = 0
factor = 1    # place value of the current bit (1, 2, 4, ...)
print ("Enter Binary Number:")
bin = int(input())
while(bin > 0):
    if((bin % 10) == True):   # the lowest decimal digit of bin is a 1
        dec += factor
    bin /= 10                 # drop the lowest digit
    factor = factor * 2       # the next bit is worth twice as much
print ("I think your binary number in decimal form is: " , dec)
This is my first post on this website. The code above is supposed to convert a binary number entered by the user into its decimal equivalent. For some reason it works fine in Python 2 and prints the correct decimal value, but in Python 3 it only ever prints a 1 or a 2, which isn't correct. Does anyone know what's wrong with this code? I've tried many things but nothing seems to work. (By the way, I'm a beginner.)
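To make the problem concrete: when I enter 1011, I expect the program to print 11, since each 1 bit should add its place value. This is just my own sanity check, not part of the program above:

# what I expect for the input 1011:
# the bits, read right to left, are worth 1, 2, 4, 8, ...
value = 1*1 + 1*2 + 0*4 + 1*8
print(value)   # prints 11

Python 2 gives me that 11, but Python 3 gives me 1 for the same input.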