The execution times for i += 1 and i = i + 1 are different. For example, given the following snippets:
snip = """
x = 0
for i in range(10000):
x += i
"""
snip2 = """x=0
for i in range(10000):
x = x + i
"""
First, run each once:
print timeit.timeit(snip, number=1)
print timeit.timeit(snip2, number=1)
0.000745058059692
0.000648021697998
Here x = x + i is faster than x += i. Next, run them a thousand times:
print timeit.timeit(snip, number=1000)
print timeit.timeit(snip2, number=1000)
0.663557052612
0.661299943924
Now both take nearly the same time, differing by only a couple of milliseconds. Finally, run them a million times:
print timeit.timeit(snip, number=1000000)
print timeit.timeit(snip2, number=1000000)
771.912870884
778.865171909
This is clearly the opposite of the first case: x += i is now cheaper and x = x + i is noticeably costlier.
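One-off timings like these are fairly noisy, so for reference here is a quick sketch of how the noise could be reduced with timeit.repeat (it reuses the same loop bodies as snip and snip2 above; taking the minimum of several repeats damps interference from other processes):

import timeit

snip = "x = 0\nfor i in range(10000):\n    x += i"
snip2 = "x = 0\nfor i in range(10000):\n    x = x + i"

# repeat() runs the whole measurement several times and returns a list
# of timings; the minimum is the least-disturbed run.
print(min(timeit.repeat(snip, repeat=5, number=1000)))
print(min(timeit.repeat(snip2, repeat=5, number=1000)))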
What is the difference in how these two statements execute? What is the Big O time complexity of each?
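In case it helps, here is a minimal sketch of how the dis module can show what bytecode each statement compiles to (assuming CPython; on the interpreter versions I have seen, the augmented form compiles to INPLACE_ADD and the plain form to BINARY_ADD, while newer 3.x releases merge both into BINARY_OP with different arguments):

import dis

def augmented():
    x = 0
    for i in range(10000):
        x += i        # augmented assignment

def plain():
    x = 0
    for i in range(10000):
        x = x + i     # plain addition, then rebinding x

dis.dis(augmented)    # typically shows INPLACE_ADD
dis.dis(plain)        # typically shows BINARY_ADD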
This question originates from a situation on my server: with statements using i += 1, billions of DB transactions end in deadlock, but with i = i + 1 everything works fine.