In fastai, during training the validation loss and evaluation metric are calculated once per epoch, and the best model is saved if we use the SaveModelCallback() callback. However, I would like to increase the frequency of this process and evaluate the metric every n training steps (e.g. every 32 or 64 batches) in order to better capture the moment where the model starts to overfit. This is easy to do in repos like detectron2 via the BestCheckpointer class. Any ideas on how to implement such a callback in fastai? For reference, this was discussed in this forum thread but no solution emerged.
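Something along these lines is what I have in mind, an untested sketch rather than working code: `BestEveryNBatches`, the `every_n` interval, the monitored index and the file name are all my own placeholders, not fastai API, and calling `Learner.validate()` from inside `after_batch` may need extra care around training state.

```python
from fastai.callback.core import Callback


class BestEveryNBatches(Callback):
    "Sketch: run validation every `every_n` training batches and save the best model."

    def __init__(self, every_n=100, monitor_idx=0, fname='best_n_batches'):
        # monitor_idx=0 means we track the validation loss returned by Learner.validate()
        self.every_n, self.monitor_idx, self.fname = every_n, monitor_idx, fname

    def before_fit(self):
        self.best = float('inf')

    def after_batch(self):
        # only act during training, on every n-th batch of the epoch
        if not self.training or self.iter == 0 or self.iter % self.every_n != 0:
            return
        # Learner.validate() runs the validation loop and returns [val_loss, *metrics]
        res = self.learn.validate()
        # validate() puts the model in eval mode, so restore training state afterwards
        self.learn.model.train()
        self.learn.training = True
        val = res[self.monitor_idx]
        if val < self.best:
            self.best = val
            self.learn.save(self.fname)
```

Usage would presumably look like `learn.fit_one_cycle(5, cbs=BestEveryNBatches(every_n=64))`, but I am not sure whether re-entering the validation loop mid-epoch like this interferes with other callbacks, which is exactly what I would like guidance on.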