How does XGBoost do parallel computation?


Problem description

XGBoost uses additive training, in which each new tree models the residuals of the model built so far.

That process is inherently sequential, so how does it do parallel computation?
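The additive training loop in the question can be sketched in a few lines. This is a hedged illustration, not XGBoost's actual implementation: it uses scikit-learn's `DecisionTreeRegressor` as the base learner, and the learning rate `lr` and depth are arbitrary choices made for the sketch.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.random((500, 3))
y = X[:, 0] * 2 + np.sin(X[:, 1] * 6)

pred = np.zeros_like(y)        # start from a constant (here: zero) prediction
trees, lr = [], 0.3
for _ in range(50):
    residual = y - pred        # each new tree fits the residual of the
                               # ensemble built so far...
    tree = DecisionTreeRegressor(max_depth=3).fit(X, residual)
    pred += lr * tree.predict(X)   # ...so the boosting rounds are sequential:
    trees.append(tree)             # round t needs the predictions of round t-1

print(np.mean((y - pred) ** 2))    # training error shrinks round by round
```

Note that nothing inside this loop can reorder the rounds; any parallelism has to come from inside the per-round tree construction.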

Answer

XGBoost doesn't run multiple trees in parallel, as you noted: the predictions from each tree are needed to update the gradients before the next tree can be built.

Rather, it does the parallelization WITHIN a single tree, by using OpenMP to create branches independently.

To observe this, build a giant dataset and run with n_rounds=1. You will see all your cores firing on one tree. This is why it's so fast: well engineered.
