6

I have been playing around with neural networks for quite a while now, and recently came across the terms "freezing" and "unfreezing" layers before training a neural network while reading about transfer learning, and I am struggling to understand their usage.

  • When is one supposed to use freezing/unfreezing?
  • Which layers are to be frozen/unfrozen? For instance, when I import a pre-trained model & train it on my data, is my entire neural-net except the output layer frozen?
  • How do I determine if I need to unfreeze?
  • If so, how do I determine which layers to unfreeze & train to improve model performance?
desertnaut
  • 57,590
  • 26
  • 140
  • 166
Nizam
  • 340
  • 1
  • 6
  • 11

3 Answers

6

I would just add to the other answer that this is most commonly used with CNNs, and that the number of layers you want to freeze (i.e. not train) is determined by how similar the task you are solving is to the original one (the task the original network was trained on).

If the tasks are very similar — say you are using a CNN pretrained on ImageNet and you just want to add some other "general" objects that the network should recognize — then you might get away with training just the dense top of the network.

The more dissimilar the tasks are, the more layers of the original network you will need to unfreeze during the training.
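As a minimal sketch of the "train just the dense top" setup described above (the small convolutional base here is a stand-in for a real pretrained network such as one loaded from tf.keras.applications; layer sizes and names are made up for illustration):

```python
import tensorflow as tf

# Stand-in for a pretrained convolutional base. In practice you would load
# e.g. tf.keras.applications.MobileNetV2(weights="imagenet", include_top=False).
base = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
], name="base")

base.trainable = False  # freeze the whole base: its weights will not be updated

# Attach a new dense "top" for the new task (here, 5 classes).
model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(5, activation="softmax"),
])
model.build(input_shape=(None, 32, 32, 3))
model.compile(optimizer="adam", loss="categorical_crossentropy")

# Only the new Dense layer's kernel and bias remain trainable.
print(len(model.trainable_weights))
```

If the new task turned out to be more dissimilar, you would flip `trainable` back to `True` on some of the later base layers and re-compile before continuing training.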

Matus Dubrava
  • 13,637
  • 2
  • 38
  • 54
4

Freezing a layer means that the layer will not be trained, so its weights will not be updated.

Why do we need to freeze such layers?

Sometimes we want a deep enough NN, but we don't have enough time to train it. That's why we use pretrained models that already have useful weights. Good practice is to freeze layers starting from the input side; for example, you can freeze the first 10 layers.


For instance, when I import a pre-trained model & train it on my data, is my entire neural-net except the output layer freezed?
- Yes, that may be the case. But you can also leave a few layers just before the last one unfrozen.

How do I freeze and unfreeze layers?
- In Keras, to freeze a layer use: layer.trainable = False
And to unfreeze: layer.trainable = True
(Re-compile the model after changing these flags for them to take effect.)

If so how do I determine which layers to unfreeze & train to improve model performance?
- As I said, good practice is to freeze from the input side onward. You should tune the number of frozen layers yourself, but take into account that the more unfrozen layers you have, the slower training will be.
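A small sketch of the freeze/unfreeze pattern with layer.trainable (the model architecture and the choice of freezing the first two layers are made up for illustration):

```python
import tensorflow as tf

# A toy model; in practice this would be a loaded pretrained model.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.build(input_shape=(None, 10))

n_frozen = 2  # how many layers (from the input side) to freeze — tune this

for layer in model.layers[:n_frozen]:
    layer.trainable = False   # frozen: weights will not be updated
for layer in model.layers[n_frozen:]:
    layer.trainable = True    # unfrozen: weights will be trained

# Re-compile so the trainable flags take effect.
model.compile(optimizer="adam", loss="mse")
```

Each unfrozen Dense layer contributes a kernel and a bias to `model.trainable_weights`, so freezing more layers directly shrinks the set of weights the optimizer has to update.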

Yoskutik
  • 1,859
  • 2
  • 17
  • 43
  • I think the question "How do I determine if I need to unfreeze?" Isn't about the technical solution but instead it's a question like "How do I know that I should unfreeze a frozen layer?". Thus a question about strategies and indicators that help to make the decision to unfreeze a layer. – BeWu Jun 06 '20 at 08:34
0

When training a model with transfer learning, we freeze certain layers for multiple reasons: they might have already converged, or we want to train only the newly added layers on top of an already pre-trained model. This is a really basic concept of transfer learning, and I suggest you go through an introductory article if you are not yet familiar with transfer learning.

maria_g
  • 130
  • 1
  • 6