
Could someone explain to me what methods Keras (Theano backend) uses when training (the `fit` function) convolutional neural networks, if the activation function is ReLU? It seems the backpropagation method cannot be used, because the ReLU activation function is not differentiable at 0.

Blauharley
  • Why not use the documentation of Keras? – bigbounty Jan 06 '18 at 10:33
  • Backpropagation is still used with ReLUs, it doesn't seem to matter that in theory the ReLU is not differentiable at zero. This is one point where theory and practice disagree. – Dr. Snoopy Jan 06 '18 at 12:07
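As the second comment says, backpropagation is used as-is: in practice, frameworks simply define the ReLU "derivative" at 0 to be some fixed value (typically 0), which is a valid subgradient there. A minimal NumPy sketch of this convention (function names are illustrative, not Keras internals):

```python
import numpy as np

def relu(x):
    """Forward pass: max(x, 0), applied elementwise."""
    return np.maximum(x, 0.0)

def relu_grad(x):
    """Backward pass convention: derivative is 1 for x > 0, else 0.
    At x == 0 the true derivative is undefined; any value in [0, 1]
    is a valid subgradient, and implementations typically pick 0."""
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, 0.0, 3.0])
upstream = np.ones_like(x)              # gradient arriving from the next layer
downstream = upstream * relu_grad(x)    # gradient passed to the previous layer
print(downstream)                       # [0. 0. 1.]
```

Since exact-zero pre-activations are rare with floating-point arithmetic, this arbitrary choice at a single point has no practical effect on training.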

0 Answers