
I want to train the YOLO v8 in transfer learning on my custom dataset.
I have different classes than the base training on the COCO dataset.

Yet I don't want to learn the feature extraction again. Hence I thought of following the Ultralytics YOLOv8 Docs - Train.

Yet, when I train on my small dataset, I want to freeze the backbone.

How can I do that?

I looked at the documentation and couldn't find how to do so.

    [This](https://docs.ultralytics.com/yolov5/tutorials/transfer_learning_with_frozen_layers/) might help – Seon Jun 28 '23 at 08:10
  • @Seon, It does to some degree. Though it seems to be cli oriented. I wonder how it works with `v8`. – David Jun 28 '23 at 08:23

1 Answer


You can do the following:

def freeze_layer(trainer):
    model = trainer.model
    num_freeze = 10
    print(f"Freezing {num_freeze} layers")
    freeze = [f'model.{x}.' for x in range(num_freeze)]  # name prefixes of the layers to freeze
    for k, v in model.named_parameters():
        v.requires_grad = True  # train all layers by default
        if any(x in k for x in freeze):
            print(f'freezing {k}')
            v.requires_grad = False  # exclude this parameter from gradient updates
    print(f"{num_freeze} layers are frozen.")

Then register this function as a custom callback on the model:

from ultralytics import YOLO

model = YOLO("yolov8x.pt")
model.add_callback("on_train_start", freeze_layer)
model.train(data="./dataset.yaml")
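To see why the freeze prefixes end with a trailing dot, here is a standalone sketch of the same name-matching logic, run on plain Python stand-ins instead of a real YOLOv8 model (the parameter names below are hypothetical examples of Ultralytics' `model.<index>.<...>` naming scheme):

```python
class Param:
    """Minimal stand-in for a torch parameter."""
    def __init__(self):
        self.requires_grad = True

# Hypothetical parameter names in the "model.<index>.<...>" style
params = {
    "model.0.conv.weight": Param(),
    "model.9.bn.bias": Param(),
    "model.10.conv.weight": Param(),     # index 10 is outside the frozen range
    "model.22.dfl.conv.weight": Param(),
}

num_freeze = 10
freeze = [f"model.{x}." for x in range(num_freeze)]  # trailing dot matters

for name, p in params.items():
    # Same test as in freeze_layer: freeze if any prefix occurs in the name
    p.requires_grad = not any(prefix in name for prefix in freeze)

frozen = sorted(n for n, p in params.items() if not p.requires_grad)
print(frozen)  # → ['model.0.conv.weight', 'model.9.bn.bias']
```

Without the trailing dot, the prefix `model.1` would also match `model.10` and `model.22` would survive only by luck; the dot makes the match stop at an exact module index.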

The original answer was provided in one of the issues in the ultralytics repo: Freezing layers yolov8 #793