I am trying to make a calculator in Python that tells you how long it would take light to travel X miles, but I'm having trouble with the division result.
When I input "1" or any small number, it gives me a larger number than it should. For example, I input "5" and it tells me "It would take 2.684096876850903e-05 seconds for light to travel 5.0 miles.", even though it should only take 0.00002684096877 seconds. What's odd is that the calculator actually works with large numbers. (For example, if I input the value of milesPerSecond, 186282.397, it gives me 1 second, like it should.)
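For reference, here is the division on its own, outside the loop (a minimal check, assuming Python 3; the second print just forces fixed-point formatting):

    # Minimal check, assuming Python 3: the division by itself, outside the loop
    seconds = 5 / 186282.397
    print(seconds)            # prints 2.684096876850903e-05
    print(f"{seconds:.14f}")  # prints 0.00002684096877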
Is there something I am doing wrong? Thanks. Here is my full code:
variable = 1
while variable == 1:
    input1 = input("How many miles do you want light to travel? ")
    miles = float(input1)
    milesPerSecond = 186282.397
    seconds = miles / milesPerSecond
    if seconds < 60:
        print("It would take", seconds, "seconds for light to travel", miles, "miles.")
    elif seconds < 3600:
        print("It would take", seconds / 60, "minutes for light to travel", miles, "miles.")
    elif seconds < 86400:
        print("It would take", seconds / 3600, "hours for light to travel", miles, "miles.")
    else:
        print("It would take", seconds / 86400, "days for light to travel", miles, "miles.")
    print("")