
I'm a beginner in deep learning and numpy, and I'm working through the numpy code in this tutorial < https://medium.com/@14prakash/back-propagation-is-very-simple-who-made-it-complicated-97b794c97e5c > to study backpropagation.

The tutorial gives the derivative-of-softmax formula (shown as an image in the original post):

dp_i/da_j = p_i (delta_ij - p_j), i.e. p_i (1 - p_i) when i = j and -p_i p_j when i ≠ j.

I'm trying to implement the derivative of softmax shown above in numpy. I find that the derivative of softmax with the cross-entropy loss function is very clear and clean, but I would like to implement only the derivative of softmax itself. Is there a useful numpy function that can help implement that formula? Thanks.

SongDoHou
  • Yeah, [`np.exp`](https://docs.scipy.org/doc/numpy/reference/generated/numpy.exp.html) - is that what you're looking for? – Szymon Maszke Apr 14 '19 at 17:32
  • Thanks, but I can't figure out how to generalize that formula into a function for the derivative of softmax :) – SongDoHou Apr 14 '19 at 17:36
  • `I find that the derivative of softmax with the cross-entropy loss function is very clear and clean.` - I thought you found that clear conceptually. If you are looking for a softmax derivative implementation in `numpy`, there have been a ton of questions like this, e.g. [here](https://stackoverflow.com/a/40576872). – Szymon Maszke Apr 14 '19 at 17:48
  • Thank you. It was a calculation mistake on my part. Thank you for your help. – SongDoHou Apr 15 '19 at 12:55
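
For reference, here is a minimal sketch of how the formula above can be written in numpy, along the lines of the answer linked in the comments. The names `softmax` and `softmax_jacobian` are illustrative, not from the tutorial:

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

def softmax_jacobian(z):
    # dp_i/da_j = p_i * (delta_ij - p_j):
    # np.diag(p) supplies the p_i * delta_ij term,
    # np.outer(p, p) supplies the p_i * p_j term.
    p = softmax(z)
    return np.diag(p) - np.outer(p, p)

z = np.array([1.0, 2.0, 3.0])
J = softmax_jacobian(z)
print(J)              # 3x3 Jacobian matrix
print(J.sum(axis=1))  # each row sums to ~0, since the softmax outputs sum to 1
```

Together, `np.diag` and `np.outer` express p_i (delta_ij - p_j) for all i, j at once, with no explicit loop.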

0 Answers