
I am in the process of implementing a Quasi-Newton optimizer for TensorFlow, and my question is: when the Optimizer's apply_gradients function is called inside the minimize function, are the gradients applied at whatever values the tensors happen to have at that moment in time?

Cheers, Sergey

1 Answer


The apply_gradients function can be run at any time and will update the current weights in the network. You can replace all of the gradients with ones yourself and watch the weights change accordingly (with plain gradient descent, each weight moves by -learning_rate * 1 per step).

You can see this in the docs or in the TensorFlow source on GitHub.
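Here is a minimal sketch of the "all ones" experiment described above, using the TensorFlow 1.x API (the variable names and learning rate are illustrative; with learning_rate=1.0, each weight moves by exactly -1 per step):

```python
import tensorflow as tf

# Two weights and a simple quadratic loss.
w = tf.Variable([2.0, 3.0], name="w")
loss = tf.reduce_sum(tf.square(w))

opt = tf.train.GradientDescentOptimizer(learning_rate=1.0)
grads_and_vars = opt.compute_gradients(loss)

# Replace every gradient with a tensor of ones of the matching shape.
ones_grads = [(tf.ones_like(v), v) for g, v in grads_and_vars]
train_op = opt.apply_gradients(ones_grads)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(w))   # [2. 3.]
    sess.run(train_op)
    print(sess.run(w))   # [1. 2.] -- each weight moved by -learning_rate * 1
```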

Steven
  • In terms of the mathematical formulation, the net is then treated as a function of the weights, not a function of the input data? So the gradient is computed symbolically and applied at the current value of the weights? –  Mar 12 '17 at 02:30
  • For proper computation of the gradients, though, the net is necessarily a function of the inputs. So when you run opt.minimize(cost), it calls opt.compute_gradients(cost), which evaluates the inputs at each stage and returns gradients based on them. However, the apply_gradients function does not actually care how the gradients were obtained; it just needs them to have the proper shapes for the given computation graph (see the sketch after these comments). You can read here about TensorFlow's symbolic gradients: http://stackoverflow.com/questions/36370129/does-tensorflow-use-automatic-or-symbolic-gradients – Steven Mar 12 '17 at 02:42
  • Fantastic! Thanks! –  Mar 12 '17 at 02:53
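Since apply_gradients is indifferent to where the gradient values come from, an external routine (such as a Quasi-Newton step computed outside the graph) can feed them in. A minimal sketch under the TensorFlow 1.x API, with a hypothetical placeholder named external_grad standing in for the externally computed step:

```python
import numpy as np
import tensorflow as tf

w = tf.Variable(np.zeros(4, dtype=np.float32), name="w")
loss = tf.reduce_sum(tf.square(w - 1.0))

opt = tf.train.GradientDescentOptimizer(learning_rate=0.1)

# Placeholder with the same shape as the variable; whatever values are fed
# here are applied exactly as if they were the computed gradients.
external_grad = tf.placeholder(tf.float32, shape=w.get_shape())
apply_op = opt.apply_gradients([(external_grad, w)])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # A custom step direction computed outside the graph:
    step = np.ones(4, dtype=np.float32)
    sess.run(apply_op, feed_dict={external_grad: step})
    print(sess.run(w))  # each weight moved by -0.1 * 1 = -0.1
```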