For each resampling iteration (driven by trainControl()), caret fits a model. That is what you are seeing with the "10 of 10" epochs cycling continuously: each cycle is one resample/fold being fit. You can change the number of epochs used during hyperparameter tuning by setting the epochs argument to train(), which is passed through to the "mlpKerasDropout" training method via the dots (...).
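For example, a minimal sketch (assuming x and y hold your predictors and outcome; the epochs value is just an illustration) that caps each resample's fit at 50 epochs instead of the keras default of 10:

library(caret)

quick_tune <- train(x, y,
                    method = "mlpKerasDropout",
                    trControl = trainControl(method = 'cv', number = 5),
                    epochs = 50)  # forwarded through ... to keras::fit()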
See the code for mlpKerasDropout here: https://github.com/topepo/caret/blob/master/models/files/mlpKerasDropout.R
By default, the search argument for hyperparameters is set to 'grid', but you may want to set it to 'random' so that activation functions other than relu are tried, or provide your own tuning grid.
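If you go the custom-grid route, the column names must match the tuning parameters declared in the mlpKerasDropout definition linked above (size, dropout, batch_size, lr, rho, decay, activation); the values below are placeholders to adapt to your problem, not recommendations:

my_grid <- expand.grid(size = c(16, 32, 64),
                       dropout = c(0.2, 0.4),
                       batch_size = 32,
                       lr = c(0.001, 0.01),
                       rho = 0.9,
                       decay = 0,
                       activation = c('relu', 'tanh', 'sigmoid'))

# then pass tuneGrid = my_grid to train() in place of tuneLength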
Here's a code sample showing the use of tuneLength with search = 'random', along with early stopping and the epochs argument passed through to keras:
tune_model <- train(x, y,
                    method = "mlpKerasDropout",
                    preProc = c('center', 'scale', 'spatialSign'),
                    trControl = trainControl(search = 'random', classProbs = TRUE,
                                             summaryFunction = mnLogLoss,
                                             allowParallel = TRUE),
                    metric = 'logLoss',
                    tuneLength = 20,
                    # keras arguments follow, forwarded via ...
                    validation_split = 0.25,
                    callbacks = list(
                      keras::callback_early_stopping(monitor = "val_loss", mode = "auto",
                                                     patience = 20,
                                                     restore_best_weights = TRUE)
                    ),
                    epochs = 500)
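With restore_best_weights = TRUE and a patience of 20, the large epochs value is effectively an upper bound: each fit stops once val_loss has failed to improve for 20 consecutive epochs and rolls back to the best weights seen.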
Keep in mind you will want to re-fit your model on the full training data after completing the hyperparameter-tuning CV.
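One way to do that final fit (a sketch: tune_model$bestTune is the single-row data frame of winning hyperparameters, and trainControl(method = 'none') skips resampling so train() fits once on all of the training data):

final_model <- train(x, y,
                     method = "mlpKerasDropout",
                     preProc = c('center', 'scale', 'spatialSign'),
                     trControl = trainControl(method = 'none', classProbs = TRUE),
                     tuneGrid = tune_model$bestTune,
                     epochs = 500)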