I have a cost function of the following form as part of the computational graph:

cost = term_1 - alpha * term_2

I want to dynamically anneal the value of alpha during training but I cannot find a straightforward way to do it. Do you have any suggestions?

Thanks

AutomEng
  • is `alpha` a tensor variable? – Dirk Nachbar Aug 22 '17 at 10:24
  • Yes, there's a part in my code where I try to assign a new value to alpha, every 500 steps. It looks like this: `if i % 500 == 0: st_alpha /= 2 sess.run(alpha.assign(st_alpha)) print("alpha:", temp)` but I get a `RuntimeError: Graph is finalized and cannot be modified.` – AutomEng Aug 22 '17 at 10:39

1 Answer

You can create a placeholder for `alpha` and feed its current value in at each training step, instead of baking it into the graph as a constant. Your problem is essentially the same as setting an adaptive learning rate, so check out this: How to set adaptive learning rate for GradientDescentOptimizer?
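Here is a minimal sketch of that idea (graph-mode TF1-style code via the `tf.compat.v1` API; `term_1` and `term_2` are hypothetical stand-ins for your actual cost terms, and the halve-every-500-steps schedule is taken from your comment):

```python
import tensorflow.compat.v1 as tf  # assumes TF 2.x with the v1 compat API

tf.disable_eager_execution()

# Hypothetical stand-ins for the two terms of your cost.
term_1 = tf.constant(3.0)
term_2 = tf.constant(1.0)

# alpha is a placeholder, so no graph modification is ever needed:
# you just feed a different value each step.
alpha = tf.placeholder(tf.float32, shape=[], name="alpha")
cost = term_1 - alpha * term_2

st_alpha = 1.0
with tf.Session() as sess:
    for i in range(1, 1501):
        if i % 500 == 0:
            st_alpha /= 2  # anneal: halve alpha every 500 steps
        # In real training you would run your train op here with the
        # same feed_dict; we just evaluate the cost.
        c = sess.run(cost, feed_dict={alpha: st_alpha})

print("final alpha:", st_alpha)  # 1.0 halved at steps 500, 1000, 1500
print("final cost:", c)
```

Because the feed happens outside the graph, this also avoids the `RuntimeError: Graph is finalized and cannot be modified.` you hit with `alpha.assign(...)` after the graph was finalized.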

Dmytro Prylipko