I was trying to get the mean and standard deviation of a log-normal distribution with mu=0.4104857306 and sigma=3.4070874277012617, where I am expecting mean=500 and std=600, but I am unsure what I have done wrong. Here is the code:
import scipy.stats as stats
import numpy as np

a = 3.4070874277012617  # sigma
b = 0.4104857306        # mu
c = stats.lognorm.mean(a, b)  # positional arguments are (s, loc, scale), so b is passed as loc
d = stats.lognorm.var(a, b)
e = np.sqrt(d)
print("Mean =", c)
print("std =", e)
And the output is:
Mean = 332.07447304207903
std = 110000.50047821256
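For comparison, here is a minimal sketch of the parameterization I believe scipy uses, where the shape s is sigma and scale is exp(mu); the second positional argument above is actually loc, not mu:

import numpy as np
import scipy.stats as stats

sigma = 3.4070874277012617
mu = 0.4104857306

# scipy's lognorm takes shape s = sigma and scale = exp(mu); loc stays at its default of 0
mean = stats.lognorm.mean(sigma, scale=np.exp(mu))
std = stats.lognorm.std(sigma, scale=np.exp(mu))
print("Mean =", mean)  # about 500 for these values
print("std =", std)    # far larger than 600, since this sigma is large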
Thank you in advance.
Edit:
Thank you for your help. I have checked and found that there were some calculation mistakes. I can now get mean=500 but still cannot get std=600. Here is the code that I have used:
import numpy as np
from scipy.optimize import fsolve

def f(z):
    mean = 500
    std = 600
    sigma = z[0]
    mu = z[1]
    f = np.zeros(2)
    f[0] = np.exp(mu + (sigma**2) / 2) - mean  # log-normal mean
    f[1] = np.exp(2*mu + sigma**2) * np.exp(sigma**2 - 1) - std**2  # intended to be the log-normal variance
    return f

z = fsolve(f, [1.1681794012855686, 5.5322865416282365])
print("sigma =", z[0])
print("mu =", z[1])
print(f(z))
sigma = 1.1681794012855686
mu = 5.5322865416282365
I have tried to check the result with my calculator, and there I can get std=600 as required, but with lognorm.std(sigma, scale=np.exp(mu)) I still get 853.5698320847896.
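In case it is useful, here is a minimal sketch of the closed-form inversion I would expect to work, using the standard identities mean = exp(mu + sigma**2/2) and var = (exp(sigma**2) - 1) * exp(2*mu + sigma**2); note the grouping (exp(sigma**2) - 1), which differs from the exp(sigma**2 - 1) factor in the code above:

import numpy as np
import scipy.stats as stats

mean, std = 500.0, 600.0

# invert the log-normal moment formulas directly:
#   mean = exp(mu + sigma**2 / 2)
#   var  = (exp(sigma**2) - 1) * exp(2*mu + sigma**2)
# dividing var by mean**2 gives exp(sigma**2) - 1, so:
sigma2 = np.log(1.0 + (std / mean) ** 2)
sigma = np.sqrt(sigma2)          # about 0.9445
mu = np.log(mean) - sigma2 / 2   # about 5.7686

print("sigma =", sigma)
print("mu =", mu)
print("check mean =", stats.lognorm.mean(sigma, scale=np.exp(mu)))  # about 500
print("check std  =", stats.lognorm.std(sigma, scale=np.exp(mu)))   # about 600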