I am new to machine learning and am learning how to implement softmax in Python; I was following the thread below.
I was doing some analysis. Say we have an array:
batch = np.asarray([[1000,2000,3000,6000],[2000,4000,5000,6000],[1000,2000,3000,6000]])
batch1 = np.asarray([[1,2,2,6000],[2,5,5,3],[3,5,2,1]])
and I tried to implement softmax (as mentioned in the link above) via:
1) Shared by Pab Torre:
np.exp(z) / np.sum(np.exp(z), axis=1, keepdims=True)
2) Asked in initial question:
e_x = np.exp(x - np.max(x))
return e_x / e_x.sum()
With both of these I am getting errors (value out of bounds, since np.exp overflows for large inputs), so I resorted to normalization and tried to run it:
x= np.mean(batch1)
y = np.std(batch1)
e_x = np.exp((batch1 - x)/y)
j = e_x / e_x.sum(axis = 0)
So my question to all: is this a valid way to implement softmax? If not, how can I handle the above cases?
Thanks in advance
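Edit: for comparison, here is a row-wise max-shift variant I sketched myself (not from the linked thread; I am assuming the softmax should be taken over each row, i.e. axis=1). Subtracting each row's maximum before exponentiating avoids the overflow, and it does not change the result, since softmax is invariant to adding a constant to a row:

```python
import numpy as np

def softmax(z):
    # Work in float64 so np.exp behaves predictably.
    z = np.asarray(z, dtype=np.float64)
    # Shift each row by its own max: the largest exponent becomes 0,
    # so np.exp never overflows, and the row-wise result is unchanged.
    shifted = z - z.max(axis=1, keepdims=True)
    e = np.exp(shifted)
    # Normalize per row (keepdims makes the division broadcast correctly).
    return e / e.sum(axis=1, keepdims=True)

batch = np.asarray([[1000, 2000, 3000, 6000],
                    [2000, 4000, 5000, 6000],
                    [1000, 2000, 3000, 6000]])
print(softmax(batch))
```

Note that for rows with huge gaps like 1000 vs 6000, the smaller entries underflow to exactly 0 and each row comes out essentially one-hot; that is the mathematically expected answer, not an error. Also, dividing by the standard deviation as in my workaround above computes a different function: softmax is invariant to shifts but not to scaling, so standardizing changes the output distribution.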