How to compute standard deviation errors with scipy.optimize.least_squares


Question

I compare fitting with optimize.curve_fit and optimize.least_squares. With curve_fit I get the covariance matrix pcov as an output, and I can calculate the standard deviation errors for my fitted variables with:

perr = np.sqrt(np.diag(pcov))

If I do the fitting with least_squares, I do not get any covariance matrix as output and I am not able to calculate the standard deviation errors for my variables.

Here is my example:

#import modules
import matplotlib
import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import curve_fit
from scipy.optimize import least_squares

noise = 0.5
N = 100
t = np.linspace(0, 4*np.pi, N)

# generate data
def generate_data(t, freq, amplitude, phase, offset, noise=0, n_outliers=0, random_state=0):
    #formula for data generation with noise and outliers
    y = np.sin(t * freq + phase) * amplitude + offset
    rnd = np.random.RandomState(random_state)
    error = noise * rnd.randn(t.size)
    outliers = rnd.randint(0, t.size, n_outliers)
    error[outliers] *= 10
    return y + error

#generate data
data = generate_data(t, 1, 3, 0.001, 0.5, noise, n_outliers=10)

#initial guesses
p0=np.ones(4)
x0=np.ones(4)

# create the function we want to fit
def my_sin(x, freq, amplitude, phase, offset):
    return np.sin(x * freq + phase) * amplitude + offset

# create the function we want to fit for least-square
def my_sin_lsq(x, t, y):
    # residuals for least_squares; parameter order matches my_sin:
    # freq = x[0], amplitude = x[1], phase = x[2], offset = x[3]
    return (np.sin(t * x[0] + x[2]) * x[1] + x[3]) - y

# now do the fit for curve_fit
fit = curve_fit(my_sin, t, data, p0=p0)
print('Curve fit output: ' + str(fit[0]))

#now do the fit for least_square
res_lsq = least_squares(my_sin_lsq, x0, args=(t, data))
print('Least_squares output: ' + str(res_lsq.x))


# we'll use this to plot our first estimate. This might already be good enough for you
data_first_guess = my_sin(t, *p0)

#data_first_guess_lsq = x0[2]*np.sin(t*x0[0]+x0[1])+x0[3]
data_first_guess_lsq = my_sin(t, *x0)

# recreate the fitted curve using the optimized parameters
data_fit = my_sin(t, *fit[0])
data_fit_lsq = my_sin(t, *res_lsq.x)

#calculation of residuals
residuals = data - data_fit
residuals_lsq = data - data_fit_lsq
ss_res = np.sum(residuals**2)
ss_tot = np.sum((data-np.mean(data))**2)
ss_res_lsq = np.sum(residuals_lsq**2)
ss_tot_lsq = np.sum((data-np.mean(data))**2)

#R squared
r_squared = 1 - (ss_res/ss_tot)
r_squared_lsq = 1 - (ss_res_lsq/ss_tot_lsq)
print('R squared curve_fit is: ' + str(r_squared))
print('R squared least_squares is: ' + str(r_squared_lsq))

plt.figure()
plt.plot(t, data)
plt.title('curve_fit')
plt.plot(t, data_first_guess)
plt.plot(t, data_fit)
plt.plot(t, residuals)

plt.figure()
plt.plot(t, data)
plt.title('lsq')
plt.plot(t, data_first_guess_lsq)
plt.plot(t, data_fit_lsq)
plt.plot(t, residuals_lsq)

#error
perr = np.sqrt(np.diag(fit[1]))
print('The standard deviation errors for curve_fit are: ' + str(perr))

I would be very thankful for any help. Best wishes.

PS: I got a lot of input from this source and used part of its code: Robust regression

Answer

The result returned by optimize.least_squares has an attribute called jac. From the documentation:

jac : ndarray, sparse matrix or LinearOperator, shape (m, n)

Modified Jacobian matrix at the solution, in the sense that J^T J is a Gauss-Newton approximation of the Hessian of the cost function. The type is the same as the one used by the algorithm.

This can be used to estimate the covariance matrix of the parameters using the formula Sigma = (J'J)^-1:

J = res_lsq.jac
cov = np.linalg.inv(J.T.dot(J))

To find the standard deviation errors of the parameters, one can then take the square roots of the diagonal elements of the covariance matrix:

perr_lsq = np.sqrt(np.diagonal(cov))
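
Applied to the example from the question, a minimal sketch of the full comparison might look like this. The rescaling by the residual variance is an assumption added here so that the result is comparable to curve_fit's default pcov (which is computed with absolute_sigma=False); it is not part of the formula quoted above.

# continuing from the script above (J, cov and perr_lsq already computed)
# assumption: rescale by the residual variance so the estimate is comparable to
# curve_fit's default pcov; res_lsq.cost equals 0.5 * sum(residuals**2)
s_sq = 2 * res_lsq.cost / (len(t) - len(res_lsq.x))
perr_lsq_scaled = perr_lsq * np.sqrt(s_sq)

print('The standard deviation errors for least_squares are: ' + str(perr_lsq_scaled))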
