
I'm working on a new optimizer, and I managed to work out most of the process. Only thing I'm stuck on currently is finding gen_training_ops.

Apparently this file is crucial: both the Gradient Descent and Adagrad optimizer implementations use functions imported from a wrapper file for gen_training_ops (training_ops.py in the python/training folder). I can't find this file anywhere, so I suppose I'm misunderstanding something and searching in the wrong place. Where can I find it? (Or, specifically, the implementations of apply_adagrad and apply_gradient_descent?)

Thanks a lot :)

martianwars
Iluha Bratan

2 Answers


The file is generated when you build TensorFlow from source. It is declared as an output (`out`) of a rule in this BUILD file:

https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/BUILD#L912

martianwars

If you find it, you'll realize it just jumps to python/framework, where the actual update is just an assign operation that then gets grouped.
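To make that concrete, here is a minimal NumPy sketch of the dense updates those two ops compute. This is not the generated TensorFlow code, just an illustration of the math; the function names mirror the op names, and the `epsilon` stabilizer in the Adagrad version is my own assumption, not part of the raw op.

```python
import numpy as np

def apply_gradient_descent(var, lr, grad):
    # var <- var - lr * grad, done in place (like the TF assign op)
    var -= lr * grad
    return var

def apply_adagrad(var, accum, lr, grad, epsilon=1e-10):
    # accum <- accum + grad^2  (running sum of squared gradients)
    # var   <- var - lr * grad / sqrt(accum)
    # epsilon is an assumed guard against division by zero
    accum += grad * grad
    var -= lr * grad / (np.sqrt(accum) + epsilon)
    return var

# Example usage
v = np.array([1.0, 2.0])
apply_gradient_descent(v, 0.1, np.array([0.5, 0.5]))
print(v)  # [0.95 1.95]
```

The real ops do essentially this on the variable's buffer, which is why the Python side reduces to an assign followed by a group.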

mshlis