
I have some sample code here:

library(xgboost)

data(agaricus.train, package = 'xgboost')
train <- agaricus.train
bst <- xgboost(data = train$data, label = train$label, max.depth = 2,
               eta = 1, nthread = 2, nrounds = 2, objective = "binary:logistic")
xgb.dump(bst, 'xgb.model.dump', with.stats = TRUE)

After building the model, the dump looks like this:

booster[0]
0:[f28<-1.00136e-05] yes=1,no=2,missing=1,gain=4000.53,cover=1628.25
    1:[f55<-1.00136e-05] yes=3,no=4,missing=3,gain=1158.21,cover=924.5
        3:leaf=1.71218,cover=812
        4:leaf=-1.70044,cover=112.5
    2:[f108<-1.00136e-05] yes=5,no=6,missing=5,gain=198.174,cover=703.75
        5:leaf=-1.94071,cover=690.5
        6:leaf=1.85965,cover=13.25
booster[1]
0:[f59<-1.00136e-05] yes=1,no=2,missing=1,gain=832.545,cover=788.852
    1:[f28<-1.00136e-05] yes=3,no=4,missing=3,gain=569.725,cover=768.39
        3:leaf=0.784718,cover=458.937
        4:leaf=-0.96853,cover=309.453
    2:leaf=-6.23624,cover=20.4624

I have two questions:

  1. I understand that gradient boosted trees average the results of the individual trees with some weighting coefficients. How can I get those coefficients?

  2. Just to clarify: the value predicted by each tree is the leaf = x value, isn't it?

Thank you.

  • [This](http://stackoverflow.com/questions/32950607/how-to-access-weighting-of-indiviual-decision-trees-in-xgboost) might be helpful. – Aramis7d Jan 06 '16 at 08:52
  • I think my answer in this thread can be helpful: http://stackoverflow.com/questions/39858916/xgboost-how-to-get-probabilities-of-class-from-xgb-dump-multisoftprob-objecti/40632862#40632862 – Run2 Nov 16 '16 at 14:14

1 Answer


Combined answer for Q1 and Q2:

In xgboost the coefficient on every tree's leaf score is 1; the trees are summed, not averaged. Simply sum the leaf scores of the leaves an observation falls into, and call that sum S. Then apply the logistic (two-class) function to it: Pr(label = 1) = 1 / (1 + exp(-S))

I have verified this and used it in production systems.
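
A minimal sketch of how you can check it yourself, reusing the bst model and agaricus training data from the question (outputmargin and predleaf are standard arguments of predict for an xgb.Booster):

library(xgboost)

# Raw margin S: the sum of the leaf scores the observation falls into
margin <- predict(bst, train$data, outputmargin = TRUE)

# Default output: Pr(label = 1), i.e. the logistic transform of S
prob <- predict(bst, train$data)

# The two should agree up to floating-point tolerance
all.equal(prob, 1 / (1 + exp(-margin)))

# predleaf = TRUE returns, per tree, the index of the leaf each observation
# ends up in; the "leaf = x" values in the dump are the scores of those leaves
head(predict(bst, train$data, predleaf = TRUE))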