How to run a prediction on GPU?


Problem Description

I am using the h2o4gpu.solvers.xgboost.RandomForestClassifier model, and the parameters I have set are:

XGBClassifier(base_score=0.5, booster='gbtree', colsample_bylevel=1,
colsample_bytree=1.0, gamma=0, learning_rate=0.1, max_delta_step=0,
max_depth=8, min_child_weight=1, missing=nan, n_estimators=100,
n_gpus=1, n_jobs=-1, nthread=None, num_parallel_tree=1, num_round=1,
objective='binary:logistic', predictor='gpu_predictor',
random_state=123, reg_alpha=0, reg_lambda=1, scale_pos_weight=1,
seed=None, silent=False, subsample=1.0, tree_method='gpu_hist')

When I train this model and then predict, everything runs fine on the GPU.
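
For reference, a minimal sketch of that train-then-predict flow, assuming h2o4gpu's drop-in xgboost wrapper at the import path named in the question; the data arrays here are synthetic stand-ins for the real dataset:

# Sketch only: the import path and constructor parameters follow the
# question above; X_train/y_train/X_test are synthetic placeholders.
import numpy as np
from h2o4gpu.solvers.xgboost import RandomForestClassifier

rng = np.random.RandomState(123)
X_train, y_train = rng.rand(1000, 20), rng.randint(0, 2, 1000)
X_test = rng.rand(100, 20)

model = RandomForestClassifier(
    n_estimators=100, max_depth=8, learning_rate=0.1,
    objective='binary:logistic', n_gpus=1,
    tree_method='gpu_hist', predictor='gpu_predictor',
    random_state=123)

model.fit(X_train, y_train)          # training runs on the GPU (gpu_hist)
proba = model.predict_proba(X_test)  # prediction uses gpu_predictor here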

However, when I save the model with pickle, load it back in another notebook, and then run a prediction through predict_proba, everything runs on the CPU.
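
A sketch of the save/load round trip being described (the file name model.pkl is a hypothetical placeholder):

import pickle

# First notebook: persist the fitted model.
with open('model.pkl', 'wb') as f:
    pickle.dump(model, f)

# Second notebook: load it back and predict. This is the point where the
# question observes predict_proba falling back to the CPU.
with open('model.pkl', 'rb') as f:
    loaded = pickle.load(f)
proba = loaded.predict_proba(X_test)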

Why is my prediction not running on GPU?

Recommended Answer

The predictions are meant to run on CPU, so you don't need a GPU to actually use the model.
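
That said, if you do want the unpickled model to predict on the GPU again, one possible workaround is to reset the predictor on the underlying booster. This is a sketch under the assumption that the h2o4gpu wrapper exposes the booster through get_booster(), as xgboost's sklearn interface does:

# Assumption: the loaded wrapper exposes the underlying xgboost Booster
# via get_booster(); if the h2o4gpu wrapper names it differently, use the
# equivalent attribute instead.
booster = loaded.get_booster()
booster.set_param({'predictor': 'gpu_predictor'})
proba = loaded.predict_proba(X_test)  # should now go through the GPU predictor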
