Python xgboost: kernel died


Problem description


My Jupyter notebook's python kernel keeps dying. I have run all of the following code successfully before. Presently, there are issues. First, I will show you the code chunk that I am able to run successfully:

import xgboost as xgb
xgtrain = xgb.DMatrix(data = X_train_sub.values, label = Y_train.values)       # create dense matrix of training values
xgtest  = xgb.DMatrix(data = X_test_sub.values,  label = Y_test.values)        # create dense matrix of test values
param   = {'max_depth':2, 'eta':1, 'silent':1, 'objective':'binary:logistic'}  # specify parameters via map

where my data is fairly small:

X_train_imp_sub.shape
(1365, 18)


however, my notebook's kernel keeps dying on this chunk:

xgmodel = xgb.train(param,  xgtrain, num_boost_round = 2)                      # train the model
predictions = xgmodel.predict(xgtest)                                          # make prediction
from sklearn.metrics import accuracy_score                                   
accuracy = accuracy_score(y_true = Y_test, 
                          y_pred = predictions.round(), 
                          normalize = True) # If False, return # of correctly classified samples. Else, return fraction of correctly classified samples
print("Accuracy: %.2f%%" % (accuracy * 100.0))


When I break it apart and run line-by-line, the kernel appears to die on the xgb.train() line.


The data is small. The xgboost parameters should be conservative (i.e. num_boost_round = 2, max_depth: 2, eta: 1) and not computationally expensive. Not sure what is going on.


As stated, I have been able to run both chunks successfully before. I have shut down all other notebooks and restarted my computer, without luck. I am launching Jupyter through Anaconda Navigator on a MacBook Pro.


-- UPDATE -- When I selected a cell beneath my xgboost training cell, then selected: Cells --> Run All Above, the kernel would always die on the xgboost training line. This happened ~40-50 times in a row. I tried that many times because I was making changes to the code, thinking I would resolve the xgboost issue later.


Later on, I ran the same cells one by one, and the xgboost training completed fine the first time I tried this and every time after. I do not know why this happens, but it would be nice to know.

Answer


I was having a similar problem. This fixed it for me.

import os
os.environ['KMP_DUPLICATE_LIB_OK'] = 'True'  # let duplicate OpenMP runtimes coexist instead of aborting the process
from xgboost import XGBClassifier
