
I have a big net with many layers. I added a new fully-connected layer to the net and want to fine-tune it. However, it is tedious to set lr_mult: 0 in every layer except the new one, since there are so many layers in the net.
Is there a good way to solve this problem?
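For example, freezing just one existing layer already means adding param blocks like this, and repeating it for every layer in the net (a minimal sketch; the layer name and sizes are placeholders):

layer {
  name: "conv1"            # an existing layer to freeze (placeholder name)
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param { lr_mult: 0 }     # weights: not updated
  param { lr_mult: 0 }     # bias: not updated
  convolution_param {
    num_output: 64
    kernel_size: 3
  }
}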

Thanks.


1 Answer


How about, instead of setting lr_mult: 0 for all parameters of all layers below the new fully connected layer, you simply stop back-propagation at the new layer?

You can do that by setting propagate_down: false on that layer.
For example:

layer {
  name: "new_layer"
  type: "InnerProduct"
  ...
  inner_product_param {
    ...
  }
  propagate_down: false # do not back-propagate gradients to the layers below
}
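
Note that propagate_down is given once per bottom blob, so for a layer with several bottoms you can decide per input whether gradients flow back into it (a minimal sketch; the layer type and blob names are just placeholders):

layer {
  name: "fuse"
  type: "Eltwise"
  bottom: "feat_a"
  bottom: "feat_b"
  top: "fused"
  propagate_down: true   # gradients flow back into "feat_a"
  propagate_down: false  # "feat_b" and the layers below it are not updated
}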

Alternatively, you can use sed, a command line utility, to directly change all entries in your prototxt file:

~$ sed -i -E 's/lr_mult *: *[0-9]+/lr_mult: 0/g' train_val.prototxt

This one-liner changes every lr_mult in your train_val.prototxt to zero. You then only need to manually set the lr_mult of the new layer back to a non-zero value.
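
Once the command has run, the new layer is the only place where you put non-zero multipliers back, for example (a minimal sketch; the bottom name, output size, and the usual 1/2 weight/bias convention are just illustrative):

layer {
  name: "new_layer"
  type: "InnerProduct"
  bottom: "pool5"          # placeholder bottom blob
  top: "new_layer"
  param { lr_mult: 1 }     # weights are learned
  param { lr_mult: 2 }     # bias learns at twice the base rate (common convention)
  inner_product_param {
    num_output: 1000       # placeholder output size
  }
}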

  • Thank you for your answer. I think `sed` may be more suitable for me. By the way, I have another question: if a layer I design has two bottoms and I want only one of them to back-propagate, how should I design the layer and set net.prototxt? Thanks. – nannanmath Jan 07 '16 at 05:39
  • @nannanmath please post as new question – Shai Jan 07 '16 at 05:57