I have measured the running time of two equivalent ways of computing the mean of a NumPy array (one 1D, one 2D) using the timeit module:
>>> setup = 'import numpy as np;a=np.random.randint(100, size=(100,100));b=np.random.randint(100, size=1000)'
>>> timeit.timeit(setup=setup, stmt='a.mean()')
13.513522000001103
>>> timeit.timeit(setup=setup, stmt='a.sum()/a.size')
6.080089200000657
>>> timeit.timeit(setup=setup, stmt='b.mean()')
5.404982399999426
>>> timeit.timeit(setup=setup, stmt='b.sum()/b.size')
2.261378399998648
Surprisingly, the numpy.ndarray.mean method is slower than numpy.ndarray.sum()/numpy.ndarray.size, regardless of the size of the array.
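One thing that may or may not be relevant to the comparison: for integer input, np.mean is documented to use a float64 intermediate, while sum() accumulates in the array's own integer dtype and only the final division produces a float. Here is a small sketch (variable name a as in my setup) confirming that the two expressions still agree in value despite the different dtypes; I don't know whether this accounts for the timing gap:

import numpy as np

a = np.random.randint(100, size=(100, 100))

# mean() returns a float64 scalar for integer input (float64 intermediate per the docs)
print(a.mean().dtype)                 # float64
# sum() keeps the array's integer dtype; the division then yields a float
print(a.sum().dtype)                  # int64 on most platforms (int32 on some)
# The values should match exactly here, since the integer sum is small enough
# to be represented exactly in float64
print(a.mean() == a.sum() / a.size)   # expected: True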
Can anybody explain this? Thanks in advance!