I'm a beginner in deep learning and NumPy, and I'm trying to implement the NumPy code in this tutorial < https://medium.com/@14prakash/back-propagation-is-very-simple-who-made-it-complicated-97b794c97e5c > to study backpropagation.
The tutorial contains a formula for the derivative of softmax.
I'm trying to implement the derivative of softmax from the picture above in NumPy. I see that the derivative of softmax combined with the cross-entropy loss is very clear and clean, but I would like to implement the derivative of softmax on its own. Is there a useful NumPy function that can help me implement that formula? Thanks.
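For context, here is my current attempt, assuming the formula in question is the softmax Jacobian J[i, j] = s[i] * (delta_ij - s[j]) (the function names `softmax` and `softmax_jacobian` are just my own):

```python
import numpy as np

def softmax(x):
    # Subtract the max for numerical stability before exponentiating
    e = np.exp(x - np.max(x))
    return e / e.sum()

def softmax_jacobian(x):
    # Jacobian of softmax: J[i, j] = s[i] * (delta_ij - s[j])
    # np.diag builds the s[i] * delta_ij term,
    # np.outer builds the s[i] * s[j] term.
    s = softmax(x)
    return np.diag(s) - np.outer(s, s)

x = np.array([1.0, 2.0, 3.0])
J = softmax_jacobian(x)
```

Each column of `J` should sum to zero, since the softmax outputs always sum to 1. Is there a more direct NumPy routine for this, or is combining `np.diag` and `np.outer` the idiomatic approach?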