
This question is based on:

xgboost in R: how does xgb.cv pass the optimal parameters into xgb.train

I am trying to loop similarly, with different parameters:

for (iter in 1:100){

param <- list(objective = "binary:logistic",
              eval_metric = "auc",
              max_depth = sample(2:6,1),
              eta = runif(.01,.1,.05),
              gamma = runif(.01,.05,.1), 
              subsample = runif(.9,.8,.7),
              colsample_bytree = runif(.8,.9,.5), 
              min_child_weight = sample(30:100,1),
              max_delta_step = sample(1:10,1)
)

But it throws an error because, on the first iteration, param takes values like:

max_depth : int 6
eta : num(0)
gamma: num(0)
subsample : num(0)
colsample_bytree : num(0)
min_child_weight: int 63
max_delta_step: int 2

What might be causing this behaviour?


1 Answer


Seems like I had gotten the arguments to the runif function wrong: its signature is runif(n, min, max), so the first argument is the number of draws, not the minimum of the range.

This seems to be working:

param <- list(objective = "binary:logistic",
              eval_metric = "auc",
              max_depth = sample(2:6,1),
              eta = runif(1,.01,.05),
              gamma = runif(1,.01,.1), 
              subsample = runif(1,0.6,0.9),
              colsample_bytree = runif(1,.5,1), 
              min_child_weight = sample(30:100,1),
              max_delta_step = sample(1:10,1)
)