
I'm using TensorFlow to do gradient descent classification.

train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

Here `cost` is the cost function used in the optimization. After launching the graph in a session, it can be run as:

sess.run(train_op, feed_dict)

With this, all the variables in the cost function are updated in order to minimize the cost.

Here is my question: how can I update only some of the variables in the cost function during training? Is there a way to convert already-created variables into constants, or something similar?

Ramesh-X
  • If you defined your own cost function, you can hard-write the variables that you want constant, and not update them. I don't know if you see what I mean. – CoMartel Aug 17 '16 at 06:35
  • 3
    You can give a list of variables into `GradientDescentOptimizer.minimize()` as `var_list` (also see https://www.tensorflow.org/versions/r0.10/api_docs/python/train.html#usage and https://www.tensorflow.org/versions/r0.10/api_docs/python/train.html#Optimizer.minimize), does that do what you want? – fwalch Aug 17 '16 at 07:37
  • 2
    See http://stackoverflow.com/questions/35298326/freeze-some-variables-scopes-in-tensorflow-stop-gradient-vs-passing-variables?rq=1 – jeandut Aug 17 '16 at 07:37
  • @HarryPotfleur: I'm using a network that someone else defined and tuned and I'm going to fine tune it by adding more layers.. @fwalch, @jean: I didn't know how to use the `var_list` argument. Thanks for the links..! – Ramesh-X Aug 17 '16 at 08:40
  • 1
    Possible duplicate of ["freeze" some variables/scopes in tensorflow: stop\_gradient vs passing variables to minimize](https://stackoverflow.com/questions/35298326/freeze-some-variables-scopes-in-tensorflow-stop-gradient-vs-passing-variables) – almightyGOSU Dec 06 '17 at 10:04
  • Possible duplicate of [Holding variables constant during optimizer](https://stackoverflow.com/questions/34477889/holding-variables-constant-during-optimizer) – Ramesh-X Feb 21 '18 at 09:43

1 Answer


There are several good answers elsewhere (on Stack Overflow and Quora), so this subject should already be closed.

Just to avoid another click for people getting here:

The `minimize` function of the TensorFlow optimizer takes a `var_list` argument for that purpose:

# Collect only the trainable variables under the first scope prefix,
# and build a train op that updates just those.
first_train_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES,
                                     "scope/prefix/for/first/vars")
first_train_op = optimizer.minimize(cost, var_list=first_train_vars)

# A second train op that updates a different subset of variables.
second_train_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES,
                                      "scope/prefix/for/second/vars")
second_train_op = optimizer.minimize(cost, var_list=second_train_vars)

I took it as-is from mrry's answer.

To get the list of variable names you should use instead of `"scope/prefix/for/second/vars"`, you can use:

tf.get_default_graph().get_collection_ref(tf.GraphKeys.TRAINABLE_VARIABLES)
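To see the `var_list` mechanism end to end, here is a minimal self-contained sketch (using `tf.compat.v1` so the TF1-style graph API above also runs under TensorFlow 2; the variable names `a` and `b` are made up for illustration). Only the variable passed in `var_list` is updated by the train op; the other stays frozen:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

a = tf.Variable(1.0, name="a")  # will be trained
b = tf.Variable(1.0, name="b")  # will stay frozen
cost = tf.square(2.0 * a + b - 5.0)

optimizer = tf.train.GradientDescentOptimizer(0.1)
# Only `a` appears in var_list, so only `a` receives gradient updates.
train_op = optimizer.minimize(cost, var_list=[a])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)
    a_val, b_val = sess.run([a, b])
    print(a_val, b_val)  # a has moved away from 1.0, b is still 1.0
```

After one step, `a` changes while `b` keeps its initial value, even though both appear in the cost.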