
I am trying to freeze some layers of AlexNet by setting self.FROZEN_LAYER = ['conv2', 'conv3']. Here's the snippet:

for op_name in weights_dict:

    # Check if layer should be trained from scratch
    if op_name not in self.SKIP_LAYER:

        with tf.variable_scope(op_name, reuse=True):

            # Assign weights/biases to their corresponding tf variables
            for data in weights_dict[op_name]:

                # Biases are 1-D
                if len(data.shape) == 1:
                    var = tf.get_variable('biases',
                                          trainable=op_name not in self.FROZEN_LAYER)
                    session.run(var.assign(data))

                # Weights
                else:
                    var = tf.get_variable('weights',
                                          trainable=op_name not in self.FROZEN_LAYER)
                    session.run(var.assign(data))

But when I step into tf.get_variable() in the debugger (with op_name being 'conv2' or 'conv3'), the trainable argument never ends up being False. Does anyone know where the problem is?

Y Cheng

1 Answer


The same problem was raised in Is it possible to make a trainable variable not trainable?, and the first answer proposed there solved my issue.
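For context, the root cause is that with reuse=True, tf.get_variable() returns the already-created variable and silently ignores the trainable argument. One workaround in that style of answer is to freeze layers by filtering the optimizer's var_list instead. A minimal sketch of the name-filtering step, assuming the scope layout above (variable names like conv2/weights); the helper name is hypothetical:

```python
# Hypothetical helper: keep only variables whose top-level scope
# is not in the frozen list. Works on plain name strings so the
# filtering logic can be shown without a TensorFlow session.
FROZEN_LAYER = ['conv2', 'conv3']

def trainable_var_names(all_var_names, frozen_layers):
    """Drop variables whose scope (text before the first '/') is frozen."""
    return [name for name in all_var_names
            if name.split('/')[0] not in frozen_layers]

# In TF1 code this would be applied to real variables, e.g.:
# var_list = [v for v in tf.trainable_variables()
#             if v.name.split('/')[0] not in FROZEN_LAYER]
# train_op = optimizer.minimize(loss, var_list=var_list)
```

With var_list restricted this way, gradients are simply never computed or applied for the frozen layers, regardless of how their trainable flag was set at creation time.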
