
I have to train my model using K-fold cross-validation, but at the same time I want to use early stopping to prevent overfitting. How can this be done? Since early stopping will return a different model in each fold, does the average of the fold accuracies mean anything?

Sandeep Pandey
Not recommended; see own answer in [Early stopping with Keras and sklearn GridSearchCV cross-validation](https://stackoverflow.com/questions/48127550/early-stopping-with-keras-and-sklearn-gridsearchcv-cross-validation/48139341#48139341) – desertnaut Oct 09 '19 at 08:03

1 Answer


Even when you do not use early stopping, cross-validation gives you a different model in each fold: each fold's model has different parameters and different results. That is the point of CV, which evaluates the training *procedure* rather than any single model. So you can use early stopping inside each fold without any special precautions, and the average of the fold accuracies is still a meaningful estimate of how the whole procedure generalizes.
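A minimal sketch of the idea, using scikit-learn's `SGDClassifier` as a stand-in for any model that supports early stopping (the dataset, classifier, and hyperparameters here are illustrative assumptions, not from the question). Each fold fits a fresh model; `early_stopping=True` carves an internal validation set out of that fold's training data to decide when to stop:

```python
# K-fold CV where each fold's model trains with early stopping.
# Assumptions: synthetic data and SGDClassifier stand in for the
# asker's actual model; any estimator with early stopping works the same way.
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

fold_accuracies = []
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for train_idx, test_idx in cv.split(X, y):
    # Fresh model per fold; early stopping monitors a validation split
    # taken from this fold's training data (validation_fraction).
    model = SGDClassifier(early_stopping=True, validation_fraction=0.2,
                          n_iter_no_change=5, random_state=0)
    model.fit(X[train_idx], y[train_idx])
    # Score on the held-out fold that played no part in training/stopping.
    fold_accuracies.append(model.score(X[test_idx], y[test_idx]))

# The five models may stop at different iterations, but the mean accuracy
# still estimates the performance of the overall training procedure.
mean_acc = sum(fold_accuracies) / len(fold_accuracies)
print(mean_acc)
```

The fold models are discarded afterwards; what you keep is the estimate. Once you are happy with it, you retrain on all the data with the same recipe (including early stopping on a fresh validation split).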

Matteo Felici