How to add interaction term in Python sklearn


Question

If I have independent variables [x1, x2, x3] and I fit a linear regression in sklearn, it will give me something like this:

y = a*x1 + b*x2 + c*x3 + intercept

Polynomial regression with degree 2 will give me something like:

y = a*x1^2 + b*x1*x2 ......

I don't want to have terms with second degree like x1^2.

How do I get

y = a*x1 + b*x2 + c*x3 + d*x1*x2

if x1 and x2 have a high correlation, larger than some threshold value j?

Answer

For generating polynomial features, I assume you are using sklearn.preprocessing.PolynomialFeatures.

There's an argument in the method for considering only the interactions. So, you can write something like:

from sklearn.preprocessing import PolynomialFeatures

poly = PolynomialFeatures(interaction_only=True, include_bias=False)  # no x1^2-style terms
X_interactions = poly.fit_transform(X)

Now only your interaction terms are considered and higher degrees are omitted. Your new feature space becomes [x1, x2, x3, x1*x2, x1*x3, x2*x3].
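
If you want to confirm which columns were generated, you can ask the transformer for its output feature names (a minimal sketch with made-up data; get_feature_names_out is the name in recent sklearn releases, older versions expose get_feature_names instead):

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

poly = PolynomialFeatures(interaction_only=True, include_bias=False)
X_interactions = poly.fit_transform(X)

print(poly.get_feature_names_out())  # ['x0' 'x1' 'x2' 'x0 x1' 'x0 x2' 'x1 x2']
print(X_interactions.shape)          # (2, 6)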

You can fit your regression model on top of that:

from sklearn import linear_model

clf = linear_model.LinearRegression()
clf.fit(X_interactions, y)  # fit on the interaction-augmented features

This gives the resultant equation y = a*x1 + b*x2 + c*x3 + d*x1*x2 + e*x2*x3 + f*x3*x1.
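
To see how the fitted coefficients line up with that equation, you can inspect clf.coef_ (a small sketch continuing the snippets above; the column order is the one produced by PolynomialFeatures):

# transformed columns are ordered [x1, x2, x3, x1*x2, x1*x3, x2*x3],
# so coef_ holds a, b, c, d, f, e (x1*x3 precedes x2*x3) and intercept_ is the intercept
for name, coef in zip(['x1', 'x2', 'x3', 'x1*x2', 'x1*x3', 'x2*x3'], clf.coef_):
    print(name, coef)
print('intercept', clf.intercept_)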

Note: If you have a high-dimensional feature space, then this would lead to the curse of dimensionality, which might cause problems like overfitting / high variance.
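
The answer above adds every pairwise interaction. If you only want the x1*x2 column when the correlation exceeds the threshold j from the question, one rough sketch (not part of the answer; the threshold value, column positions, and stand-in data are assumptions) is to test the correlation with numpy and append the product column yourself:

import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # stand-in for your [x1, x2, x3] data
y = rng.normal(size=100)        # stand-in for your target
j = 0.8                         # hypothetical correlation threshold

# add the x1*x2 column only when |corr(x1, x2)| exceeds j
corr = np.corrcoef(X[:, 0], X[:, 1])[0, 1]
X_design = np.column_stack([X, X[:, 0] * X[:, 1]]) if abs(corr) > j else X

clf = LinearRegression().fit(X_design, y)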

