Calculate residual deviance from scikit-learn logistic regression model


Question


Is there any way to calculate the residual deviance of a scikit-learn logistic regression model? This is a standard output of R model summaries, but I couldn't find it in any of sklearn's documentation.

Answer

  1. As suggested by @russell-richie, it should be model.predict_proba.
  2. Don't forget the argument normalize=False in metrics.log_loss(), so that it returns the sum of the per-sample losses rather than their mean.


So to complete @ingo's answer, to obtain the model deviance with sklearn.linear_model.LogisticRegression, you can compute:

from sklearn import metrics

def deviance(X, y, model):
    # Residual deviance = 2 * total negative log-likelihood of the fitted model
    return 2 * metrics.log_loss(y, model.predict_proba(X), normalize=False)
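A minimal usage sketch, assuming a synthetic dataset generated with make_classification (the sample counts and feature sizes below are arbitrary illustrations, not from the original answer):

```python
from sklearn import metrics
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

def deviance(X, y, model):
    # Residual deviance = 2 * total negative log-likelihood of the fitted model
    return 2 * metrics.log_loss(y, model.predict_proba(X), normalize=False)

# Toy binary-classification data, then fit and evaluate
X, y = make_classification(n_samples=200, n_features=5, random_state=0)
model = LogisticRegression().fit(X, y)
print(deviance(X, y, model))  # summed over all samples, like R's residual deviance
```

Because normalize=False sums the losses, the result scales with the number of samples, matching R's convention, whereas the default (normalize=True) would give the mean log loss instead.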
