I wrote a script in Python, and its behaviour surprised me. It takes five 20-digit numbers, multiplies them together, and raises the product to the power of 3000. The timeit module is used to measure how long the calculation takes. When I run the script, it reports that the calculation took about 3*10^-7 seconds. It then writes a file, output.txt, but the script doesn't finish until about 15 seconds later.
import timeit

outputFile = open("output.txt", "w")

start = timeit.default_timer()
# Five 20-digit numbers, multiplied together and raised to the power of 3000.
x = (87459837581209463928*23745987364728194857*27385647593847564738*10293769154925693856*12345678901234567891)**3000
stop = timeit.default_timer()
time = stop - start

print "Time taken for the calculation was {} seconds".format(time)

# writelines() iterates over its argument, so passing it a string writes the
# result character by character; write() would do the same job in one call.
outputFile.writelines(str(x))
outputFile.close()

y = raw_input("Press enter to exit.")
Does this mean that it actually takes longer to write a 280 kB file than to perform the calculation? (I find that unlikely.)
If not, does Python defer the calculation until the variable x is used? Will it re-execute the calculation every time the variable is referenced, or will it store the actual value in the variable?
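One way to test whether the assignment itself is lazy is to time the assignment and a later re-use of the variable separately. This is a small sketch (the specific number 2**1000000 is just a stand-in for any expensive big-integer expression; timings will vary by machine):

```python
import timeit

t0 = timeit.default_timer()
a = 2 ** 1000000          # if evaluation were lazy, this line would be instant
t1 = timeit.default_timer()

t2 = timeit.default_timer()
b = a * 2                 # re-using the stored value: no re-computation of the power
t3 = timeit.default_timer()

print("assignment: {0:.6f} s".format(t1 - t0))
print("re-use:     {0:.6f} s".format(t3 - t2))
```

If the assignment line takes measurably longer than the re-use line, the value is computed eagerly at assignment time and stored in the variable.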
I have since written another script, which confirms that it takes Python only 0.03 seconds to write the result to a .txt file. So why does Python execute the calculation later?
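For anyone who wants to reproduce this, here is a sketch of how the three phases can be timed separately: the arithmetic, the int-to-string conversion, and the disk write. It uses the same numbers as above; the temporary output path and the digit-limit guard (CPython 3.11+ refuses very large int-to-str conversions by default) are additions for the sketch:

```python
import os
import sys
import tempfile
import timeit

# CPython 3.11+ limits int-to-str conversion length by default;
# raise the limit so the ~290,000-digit result can be converted.
if hasattr(sys, "set_int_max_str_digits"):
    sys.set_int_max_str_digits(1000000)

# The same five 20-digit factors as in the original script.
product = (87459837581209463928 * 23745987364728194857 *
           27385647593847564738 * 10293769154925693856 *
           12345678901234567891)

t0 = timeit.default_timer()
x = product ** 3000               # phase 1: the arithmetic itself
t1 = timeit.default_timer()
s = str(x)                        # phase 2: binary-to-decimal conversion
t2 = timeit.default_timer()
path = os.path.join(tempfile.gettempdir(), "output.txt")
with open(path, "w") as f:        # phase 3: the actual disk write
    f.write(s)
t3 = timeit.default_timer()

print("pow:   {0:.4f} s".format(t1 - t0))
print("str:   {0:.4f} s".format(t2 - t1))
print("write: {0:.4f} s".format(t3 - t2))
```

Splitting the timing this way shows which of the three phases the missing ~15 seconds belongs to.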