I'm new to Python, but have some experience with C#, and I'm trying to write a basic script that calculates the standard deviation of a set of randomly generated (normally distributed) values. The two histograms produced by the code below come out different, even though to my knowledge they should be the same, since I never modify the variable 'incomes' between generating them.
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
npoints = 10000
incomes = np.random.normal(100.0, 50.0, npoints)  # 10000 samples, mean 100, std dev 50
################
plt.hist(incomes, 50)  # first histogram: incomes in 50 bins
plt.show()
###############
incomestore = incomes
meanincomes = np.mean(incomes)
for i in range(incomes.size):
    incomestore[i] = (incomes[i] - meanincomes)**2  # squared deviation from the mean
standardDeviation = np.sqrt(np.mean(incomestore))   # sqrt of the mean squared deviation
print(standardDeviation)
###############
plt.hist(incomes, 50)  # second histogram: same array, same bins
plt.show()
##############
The two histograms generated by the code are different when, as far as I can tell, they should be identical (this is not just a scaling difference).
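For comparison, here is a minimal vectorized sketch of the same calculation. It assumes the goal is the population standard deviation (the quantity the loop above computes); np.std, whose default ddof=0 gives the same quantity, is included only as a cross-check.
import numpy as np
npoints = 10000
incomes = np.random.normal(100.0, 50.0, npoints)
# population standard deviation without a Python-level loop
meanincome = np.mean(incomes)
variance = np.mean((incomes - meanincome) ** 2)  # mean of squared deviations
print(np.sqrt(variance))
# cross-check with NumPy's built-in (default ddof=0, i.e. population std)
print(np.std(incomes))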