I am simulating flipping 999 coins and summing the heads, repeating that 1,000 times to get one sample mean, and repeating the whole procedure 1,000 times to draw a distribution of sample means. This takes a long time (about 21 seconds). Is there a better way to do this, for instance a faster way to run the for loops? Would vectorizing be useful?
import datetime
import numpy as np
sample_mean_dis = []
start_time = datetime.datetime.now()
# to draw a distribution of sample mean
for i in range(1000):
    if not (i % 100):
        print('iterate: ', i)
    sums_1000coins = []
    # simulate 1k repetitions of experiment_1,
    # treat this operation as one sample,
    # and compute the sample mean
    for j in range(1000):
        # this simulates experiment_1, which flips 999 coins
        # and sums the heads
        coins = np.random.randint(2, size=999)
        sums_1000coins.append(np.sum(1 == coins))
    sample_mean_dis.append(np.mean(sums_1000coins))
end_time = datetime.datetime.now()
elapsedTime = end_time - start_time
print("Elapsed time: %d seconds" % (elapsedTime.total_seconds()))