
I have a predefined code that creates a Tensorflow graph. The variables are contained in variable scopes and each has a predefined initializer. Is there any way to change the initializer of the variables?

example: The first graph defines

with tf.variable_scope('conv1'):
    w = tf.get_variable('weights')

Later on, I would like to modify the variable and change its initializer to Xavier:

with tf.variable_scope('conv1'):
    tf.get_variable_scope().reuse_variables()
    w = tf.get_variable('weights', initializer=tf.contrib.layers.xavier_initializer(uniform=False))

However, when I reuse a variable, the initializer doesn't change. Later, when I call initialize_all_variables(), I get the default initialization rather than Xavier. How can I change the initializer of a variable? Thanks

Hooked
aarbelle
    Since you want to share/reuse the variable, there is only a single variable, which should also have a single initializer; changing the initializer doesn't seem to make sense conceptually in this case, and that's probably why TensorFlow doesn't allow you to change it. Can you simply add the initializer to the first occurrence of tf.get_variable('weights') or tf.variable_scope('conv1')? – Yao Zhang Jun 24 '16 at 17:43

1 Answer


The problem is that the initializer can't be changed when setting up reuse: it is fixed by the first tf.get_variable call that creates the variable.

So just define the Xavier initializer in that first variable-scope call, and initialization of all variables will then be correct:

with tf.variable_scope(name) as scope:
    kernel = tf.get_variable("W",
                             shape=kernel_shape,
                             initializer=tf.contrib.layers.xavier_initializer_conv2d())
    # you could also define your network layer here using this kernel,
    # which would give you a model rather than just the weights

If you need to re-use that set of weights, a second call with reuse=True returns the existing variable:

with tf.variable_scope(name, reuse=True) as scope:
    kernel = tf.get_variable("W")
    # you can now reuse the xavier initialized variable
    # ....
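For reference, xavier_initializer(uniform=False) draws from a normal distribution whose variance is scaled by the layer's fan-in and fan-out. Here is a minimal pure-Python sketch of that scaling rule (the helper name xavier_scale is made up for illustration; it mirrors the Glorot formula, not TensorFlow's actual implementation):

```python
import math

def xavier_scale(fan_in, fan_out, uniform=True):
    """Glorot/Xavier scaling: target variance = 2 / (fan_in + fan_out).

    With uniform sampling the bound is sqrt(6 / (fan_in + fan_out));
    with normal sampling the stddev is sqrt(2 / (fan_in + fan_out)).
    """
    if uniform:
        return math.sqrt(6.0 / (fan_in + fan_out))
    return math.sqrt(2.0 / (fan_in + fan_out))

# e.g. a dense layer with 256 inputs and 128 outputs
print(xavier_scale(256, 128))                 # uniform bound → 0.125
print(xavier_scale(256, 128, uniform=False))  # normal stddev
```

This is why the initializer must be attached at variable creation: the scale depends on the variable's shape, which is only supplied in the first tf.get_variable call.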
kingtorus
    I have run into the same issue as @aarbelle, except the computational graph has already been defined in a TFHub module (so I can't define it initially with a custom initializer). Is it really not possible to change the initializer prior to running the global initialization op in session context? – campellcl Feb 16 '19 at 21:12