This question is based on:
xgboost in R: how does xgb.cv pass the optimal parameters into xgb.train
I am trying to loop in a similar way, sampling different parameter values on each iteration:
for (iter in 1:100) {
  param <- list(objective        = "binary:logistic",
                eval_metric      = "auc",
                max_depth        = sample(2:6, 1),
                eta              = runif(.01, .1, .05),
                gamma            = runif(.01, .05, .1),
                subsample        = runif(.9, .8, .7),
                colsample_bytree = runif(.8, .9, .5),
                min_child_weight = sample(30:100, 1),
                max_delta_step   = sample(1:10, 1)
  )
  # ... xgb.cv / xgb.train calls using param ...
}
But it throws an error, because on the first iteration param takes values such as:
max_depth        : int 6
eta              : num(0)
gamma            : num(0)
subsample        : num(0)
colsample_bytree : num(0)
min_child_weight : int 63
max_delta_step   : int 2
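The zero-length values can be reproduced by evaluating the sampling expressions on their own, outside the loop (a minimal check, assuming base R's runif and sample):

```r
# Each expression evaluated in isolation at the R console:
runif(.01, .1, .05)   # returns numeric(0), a zero-length vector
sample(2:6, 1)        # returns a single integer, as expected
```

So only the runif-based entries come back empty, while the sample-based entries look fine.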
What might be causing this behaviour?