
I am trying to optimize an xgboost tree by using feature selection with caret's genetic algorithm (`gafs`):

library(caret)

results <- gafs(iris[, 1:4], iris[, 5],
                iters = 2,
                method = "xgbTree",
                metric = "Accuracy",
                gafsControl = gafsControl(functions = caretGA, method = "cv",
                                          repeats = 2, verbose = TRUE),
                trControl = trainControl(method = "cv", classProbs = TRUE,
                                         verboseIter = TRUE)
                )

This is, however, very slow, so I would like to see some progress output. But no progress is printed at all, even though I set `verbose = TRUE` and `verboseIter = TRUE`. What am I doing wrong?

Make42
  • `xgbTree` is just taking forever. Eventually you would see `Fold01 ....`, but that's all, no plotted progress. – Julius Vainora Jan 18 '19 at 14:37
  • @JuliusVainora: I thought the point of xgboost is that it is much faster in training than other boosted methods. Have I been misled? What other method should I use instead? Other methods also took quite a long time. For now I just want to demonstrate `gafs` - what method should I use instead? – Make42 Jan 19 '19 at 21:46
  • The default one, random forest, seems to be faster. – Julius Vainora Jan 19 '19 at 21:57
  • @JuliusVainora: This sounds like the opposite of what https://stackoverflow.com/a/32946722/4533188 says. – Make42 Jan 19 '19 at 22:17
  • Well, I simply got to see `Fold01 ....` much earlier with random forest than `xgbTree`. And I'm not sure what's opposite: it says that ranger > randomForest, xgboost. – Julius Vainora Jan 19 '19 at 22:27
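Following the suggestion in the comments, a quicker way to demonstrate `gafs` is to use caret's built-in random forest helper functions (`rfGA`) instead of `caretGA`, since `caretGA` runs a full `train()` tuning loop for every chromosome in every generation. A minimal sketch (the `number`, `popSize`, and seed values are illustrative choices, not taken from the question):

```r
library(caret)

# rfGA fits a plain random forest per candidate feature subset,
# skipping the inner train() tuning loop that caretGA performs,
# so each GA iteration is much cheaper.
ctrl <- gafsControl(functions = rfGA,
                    method = "cv",
                    number = 3,          # small CV for a quick demo
                    verbose = TRUE)      # print per-generation GA progress

set.seed(42)
results <- gafs(x = iris[, 1:4],
                y = iris[, 5],
                iters = 2,               # GA generations
                popSize = 10,            # small population for speed
                gafsControl = ctrl)

results                                  # summary of the selected subset
```

With `verbose = TRUE` in `gafsControl`, the per-fold/per-generation messages (e.g. `Fold1 1 ...`) appear as each external resample finishes; the long silence with `xgbTree` is mostly the cost of fitting many boosted models before the first message is reached.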

0 Answers