I am tuning the parameters of an XGBoost regressor using a custom cross-validation method. One of the parameters I am tuning is the number of trees (n_estimators), and I am also using early_stopping_rounds so that training can stop early.
The problem is that, at the end of cross-validation, I have a different model for each fold. For example, suppose I train with n_estimators=100 and early_stopping_rounds=20: in one fold training might complete all 100 rounds without early stopping, while in the next fold it might stop at the 30th iteration, effectively giving n_estimators=30.
How should I proceed?