Is ridge binomial regression available in Python?


Problem Description

I am new to Python and I would like to fit a ridge binomial regression. I know that binomial regression is available at: http://statsmodels.sourceforge.net/devel/glm.html

I also know that logistic regression with L2 penalty can be fitted with sklearn.linear_model.

http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html
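
For illustration, a minimal sketch of an L2-penalized logistic regression in scikit-learn on synthetic data (not from my problem); C is the inverse of the regularization strength, so a smaller C means a stronger ridge penalty:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Synthetic per-trial (Bernoulli) data: 100 observations, 3 features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = rng.integers(0, 2, size=100)

    # penalty="l2" gives the ridge-type penalty; C is the inverse penalty strength.
    clf = LogisticRegression(penalty="l2", C=1.0)
    clf.fit(X, y)
    print(clf.coef_)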

Since a binomial is a sum of Bernoulli trials, I could use scikit-learn after transforming my binomial-structured data into Bernoulli structure, by changing its i-th row:

(size_i, success_i)

into a vector of length size_i containing success_i ones and size_i - success_i zeros. However, this does not work for me because size_i is very large.
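
For concreteness, a minimal sketch of that expansion with small hypothetical arrays (size, success, X); the expanded design has sum(size_i) rows, which is exactly why this breaks down when size_i is very large:

    import numpy as np

    # Hypothetical binomial data: row i has size[i] trials and success[i] successes.
    size = np.array([1000, 2500, 4000])
    success = np.array([120, 900, 3100])
    X = np.array([[0.1, 1.0],
                  [0.5, 0.0],
                  [0.9, 1.0]])

    # Expand row i into size[i] Bernoulli rows: success[i] ones and size[i] - success[i] zeros.
    y_long = np.concatenate([np.concatenate([np.ones(s), np.zeros(n - s)])
                             for n, s in zip(size, success)])
    X_long = np.repeat(X, size, axis=0)

    # Already 7500 rows for this tiny example; real size_i values make this infeasible.
    print(X_long.shape, y_long.shape)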

Is there a way to fit binomial ridge regression using Python?

Recommended Answer

statsmodels GLM does not have full penalization support yet; however, it now has elastic net (L1/L2) penalization in master. It is not yet included in the online documentation:

https://github.com/statsmodels/statsmodels/blob/master/statsmodels/genmod/generalized_linear_model.py#L1007

GLM(...).fit_regularized(alpha=..., L1_wt=0) would fit just an L2 (ridge) penalty.
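
A minimal sketch of that call on made-up data with an arbitrary alpha; for the Binomial family, endog can be passed as a two-column array of (successes, failures), so the binomial counts never have to be expanded into individual Bernoulli rows:

    import numpy as np
    import statsmodels.api as sm

    # Made-up binomial data: row i has size[i] trials and success[i] successes.
    rng = np.random.default_rng(0)
    n_obs = 200
    X = rng.normal(size=(n_obs, 3))
    size = rng.integers(50, 500, size=n_obs)
    p = 1.0 / (1.0 + np.exp(-(0.5 * X[:, 0] - 0.3 * X[:, 1])))
    success = rng.binomial(size, p)

    # Two-column endog (successes, failures) keeps the data in binomial form.
    endog = np.column_stack([success, size - success])
    exog = sm.add_constant(X)

    model = sm.GLM(endog, exog, family=sm.families.Binomial())

    # L1_wt=0 keeps only the L2 (ridge) part of the elastic net penalty;
    # alpha sets the penalty strength (0.1 here is arbitrary).
    result = model.fit_regularized(alpha=0.1, L1_wt=0.0)
    print(result.params)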

Warning: Because this feature has only just been merged and has not seen heavy usage yet, it is still considered experimental. It should produce correct results, but the API and implementation will most likely still change.
