
I have a dataset with 79 numerical variables used as predictors and 20 numerical variables used as responses, so 99 columns in total, with 140 observations of each variable (140 rows). Is there a method to cross-validate this dataset with the caret package? I use the nnet package as my model.

I searched a lot for a method but only found this code in the R documentation:

  train(x, y, method = "rf", preProcess = NULL, ...,
  weights = NULL, metric = ifelse(is.factor(y), "Accuracy", "RMSE"),
  maximize = ifelse(metric %in% c("RMSE", "logLoss", "MAE"), FALSE, TRUE),
  trControl = trainControl(), tuneGrid = NULL,
  tuneLength = ifelse(trControl$method == "none", 1, 3))

This is the default interface, but I don't know how to specify my predictor variables (the first 79 columns) and my response variables (the remaining 20 columns).

Is there a way to do that?

I followed this link: Applying k-fold Cross Validation model using caret package

However, it didn't answer my question, which is: how do I tell the train function which columns are predictors and which are responses?
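For what it's worth, one common approach is to pass the column subsets through caret's `x`/`y` interface. A minimal, self-contained sketch is below; the data frame `dat` is a stand-in for the real data (140 rows, 99 columns, positions as described above), and since `train()` accepts only a single outcome at a time, it fits one nnet model per response column:

```r
library(caret)
library(nnet)

# Stand-in data with the stated shape: 140 rows, 99 numeric columns.
set.seed(1)
dat <- as.data.frame(matrix(rnorm(140 * 99), nrow = 140, ncol = 99))

x <- dat[, 1:79]     # predictor columns (first 79)
Y <- dat[, 80:99]    # response columns (remaining 20)

ctrl <- trainControl(method = "cv", number = 5)  # 5-fold cross-validation

# train() expects one outcome vector, so loop over the 20 response
# columns and collect the fitted models in a list.
models <- lapply(Y, function(y) {
  train(x = x, y = y,
        method = "nnet",
        trControl = ctrl,
        tuneLength = 1,   # tiny tuning grid to keep this sketch fast
        linout = TRUE,    # linear output units: regression, not classification
        trace = FALSE)
})

length(models)  # one fitted caret model per response column
```

This is a sketch under the stated column layout, not a definitive recipe; `linout = TRUE` is passed through to `nnet()` so it performs regression rather than classification.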

Adham
  • https://stackoverflow.com/questions/33470373/applying-k-fold-cross-validation-model-using-caret-package – Hack-R Jul 09 '18 at 14:33
  • Possible duplicate of [Applying k-fold Cross Validation model using caret package](https://stackoverflow.com/questions/33470373/applying-k-fold-cross-validation-model-using-caret-package) – phiver Jul 09 '18 at 15:03
  • Thank you, I checked this link but it didn't answer my question. – Adham Jul 10 '18 at 02:27

0 Answers