Multivariate polynomial regression with numpy


Question

I have many samples (y_i, (a_i, b_i, c_i)) where y is presumed to vary as a polynomial in a, b, c up to a certain degree. For example, for a given set of data and degree 2, I might produce the model

y = a^2 + 2ab - 3cb + c^2 + 0.5ac

This can be done using least squares and is a slight extension of numpy's polyfit routine. Is there a standard implementation somewhere in the Python ecosystem?
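As a point of reference for what the question is asking, here is a minimal sketch (not from the original post) of doing this fit with numpy alone: build the design matrix of all monomials up to degree 2 in (a, b, c) and solve the least-squares problem with np.linalg.lstsq. The synthetic data below uses the example model from the question as ground truth.

```python
import numpy as np

# Illustrative sketch, assuming synthetic data generated from the
# question's example model: y = a^2 + 2ab - 3cb + c^2 + 0.5ac
rng = np.random.default_rng(0)
abc = rng.random((50, 3))          # 50 samples of (a_i, b_i, c_i)
a, b, c = abc.T
y = a**2 + 2*a*b - 3*c*b + c**2 + 0.5*a*c

# Design matrix: all monomials of degree <= 2 in a, b, c
A = np.column_stack([np.ones_like(a), a, b, c,
                     a**2, a*b, a*c, b**2, b*c, c**2])

# Ordinary least squares; coef holds one coefficient per monomial above
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 3))
```

Since y here is exactly a degree-2 polynomial in the inputs, the fit recovers the coefficients (1 for a^2, 2 for ab, 0.5 for ac, -3 for bc, 1 for c^2, and zeros elsewhere). For noisy data the same call returns the least-squares estimate instead.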

Answer

sklearn provides a simple way to do this.

Posting an example here:

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn import linear_model

#X is the independent variable (bivariate in this case)
X = np.array([[0.44, 0.68], [0.99, 0.23]])

#vector is the dependent data
vector = [109.85, 155.72]

#predict is an independent variable for which we'd like to predict the value
#(note the nested brackets: sklearn expects a 2D array of samples)
predict = np.array([[0.49, 0.18]])

#generate a model of polynomial features
poly = PolynomialFeatures(degree=2)

#transform the x data for proper fitting (for single variable type it returns [1, x, x**2])
X_ = poly.fit_transform(X)

#transform the prediction input with the same feature map
predict_ = poly.transform(predict)

#here we can remove polynomial orders we don't want
#for instance, removing the bias column (index 0), since LinearRegression fits its own intercept
X_ = np.delete(X_, (0), axis=1)
predict_ = np.delete(predict_, (0), axis=1)

#generate the regression object
clf = linear_model.LinearRegression()
#perform the actual regression
clf.fit(X_, vector)

print("X_ = ", X_)
print("predict_ = ", predict_)
print("Prediction = ", clf.predict(predict_))

Here is the output:

X_ =  [[ 0.44    0.68    0.1936  0.2992  0.4624]
 [ 0.99    0.23    0.9801  0.2277  0.0529]]
predict_ =  [[ 0.49    0.18    0.2401  0.0882  0.0324]]
Prediction =  [ 126.84247142]
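The same fit can be expressed more compactly with a scikit-learn Pipeline, which chains the feature expansion and the regression so that fit and predict take raw inputs directly. This is a sketch of that idiom (not part of the original answer); include_bias=False drops the constant column since LinearRegression already fits an intercept.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Same toy data as the answer above
X = np.array([[0.44, 0.68], [0.99, 0.23]])
y = [109.85, 155.72]

# Pipeline: expand to degree-2 features, then fit ordinary least squares
model = make_pipeline(PolynomialFeatures(degree=2, include_bias=False),
                      LinearRegression())
model.fit(X, y)

# predict() applies the same polynomial expansion internally
print(model.predict([[0.49, 0.18]]))
```

With only two training samples the system is underdetermined, so the fit passes through both training points exactly; the pipeline simply removes the need to transform inputs by hand before every predict call.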
