xgboost xgb.dump tree coefficient
Question
I have some sample code here.
library(xgboost)
data(agaricus.train, package = 'xgboost')
train <- agaricus.train
bst <- xgboost(data = train$data, label = train$label, max.depth = 2,
               eta = 1, nthread = 2, nround = 2, objective = "binary:logistic")
xgb.dump(bst, 'xgb.model.dump', with.stats = TRUE)
After building the model, I print it out as
booster[0]
0:[f28<-1.00136e-05] yes=1,no=2,missing=1,gain=4000.53,cover=1628.25
  1:[f55<-1.00136e-05] yes=3,no=4,missing=3,gain=1158.21,cover=924.5
    3:leaf=1.71218,cover=812
    4:leaf=-1.70044,cover=112.5
  2:[f108<-1.00136e-05] yes=5,no=6,missing=5,gain=198.174,cover=703.75
    5:leaf=-1.94071,cover=690.5
    6:leaf=1.85965,cover=13.25
booster[1]
0:[f59<-1.00136e-05] yes=1,no=2,missing=1,gain=832.545,cover=788.852
  1:[f28<-1.00136e-05] yes=3,no=4,missing=3,gain=569.725,cover=768.39
    3:leaf=0.784718,cover=458.937
    4:leaf=-0.96853,cover=309.453
  2:leaf=-6.23624,cover=20.4624
My questions:

I understand that a gradient boosted tree combines the results from these trees with some weighted coefficients. How can I get those coefficients?
Just to clarify: the value predicted by each tree is the leaf = x value, isn't it?
Thanks.

Answer
Combined answer for Q1 and Q2:
In xgboost, the coefficient for every tree's leaf score is 1. Simply sum the leaf scores across all trees; call the sum S. Then apply the two-class logistic function to it: Pr(label=1) = 1/(1+exp(-S))
I have verified this and used it in production systems.
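To illustrate, here is a minimal sketch of that calculation using the leaf scores from the dump in the question. Assume a sample whose features route it to leaf 3 of booster[0] (score 1.71218) and leaf 3 of booster[1] (score 0.784718); the question's code is R, but the arithmetic is language-agnostic, so Python is used here:

```python
import math

# Leaf scores from the dump above, for one hypothetical sample:
# booster[0] -> leaf 3 (1.71218), booster[1] -> leaf 3 (0.784718).
leaf_scores = [1.71218, 0.784718]

# The coefficient on every leaf score is 1, so the raw margin S is a plain sum.
S = sum(leaf_scores)

# Two-class logistic transform turns the margin into a probability.
pr_label_1 = 1.0 / (1.0 + math.exp(-S))

print(round(S, 6))  # 2.496898
print(pr_label_1)
```

A sample routed to different leaves would simply contribute those leaves' scores to the sum instead; no per-tree weights are involved.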