
I know that backpropagation calculates the derivatives of the cost function with respect to the model's parameters (weights and biases). However, I need to confirm that backpropagation itself does not update the weights and biases; instead, an optimizer such as Adam, gradient descent, or another variant is used to perform the update.

Thanks in advance

Faref

1 Answer


If I understand your question correctly: when you use an optimizer in a deep learning framework (PyTorch/TensorFlow), say Adam, the weight updates are performed by the optimizer. This happens automatically; you do not need to write the update code yourself, the framework updates the weights and biases for you.
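If it helps, here is a minimal sketch (assuming PyTorch, with a toy linear model and made-up data) of where the two steps happen in a training loop: loss.backward() only computes the gradients, while optimizer.step() is what actually changes the weights and biases.

    import torch

    model = torch.nn.Linear(10, 1)                      # toy model: weight + bias
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = torch.nn.MSELoss()

    x, y = torch.randn(32, 10), torch.randn(32, 1)      # made-up data

    optimizer.zero_grad()                               # clear gradients from the previous step
    loss = loss_fn(model(x), y)
    loss.backward()                                     # backpropagation: only fills param.grad
    optimizer.step()                                    # optimizer: uses param.grad to update weight + bias

Swapping torch.optim.Adam for torch.optim.SGD (or any other optimizer) changes how the update is computed, but the backward() call stays exactly the same.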

Timbus Calin
  • Does every backpropagation step use an optimizer algorithm to update the weights? Does backpropagation have no ability to update the weights on its own? – Faref Oct 29 '20 at 09:09
  • When you talk about "plain backpropagation" you are actually talking about backpropagation + SGD (the stochastic gradient descent optimizer). So practically yes, but bear in mind it is not backpropagation alone but backpropagation + optimizer, where the latter decides how the weights are updated (a hand-rolled SGD step is sketched after these comments). – Timbus Calin Oct 29 '20 at 09:50
  • The optimizer can be Adam, SGD, AdaDelta, RMSProp, etc. – Timbus Calin Oct 29 '20 at 09:50
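To make the comment above concrete, here is a rough sketch (again assuming PyTorch; the model and learning rate are purely illustrative) of what plain SGD does with the gradients that backpropagation produced. This is essentially the update step that an optimizer performs for you:

    import torch

    model = torch.nn.Linear(10, 1)                      # illustrative model
    x, y = torch.randn(32, 10), torch.randn(32, 1)

    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()                                     # backpropagation: compute gradients only

    lr = 0.01                                           # illustrative learning rate
    with torch.no_grad():
        for p in model.parameters():
            p -= lr * p.grad                            # SGD rule: w <- w - lr * dL/dw
            p.grad.zero_()                              # reset gradients for the next step

Other optimizers (Adam, RMSProp, AdaDelta, ...) replace that simple update rule with something more elaborate, but they all consume the gradients that backpropagation computed.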