Nonlinear e^(-x) regression using scipy, python, numpy


Question

The code below is giving me a flat line for the line of best fit rather than a nice curve along the model of e^(-x) that would fit the data. Can anyone show me how to fix the code below so that it fits my data?

import numpy as np  
import matplotlib.pyplot as plt
import scipy.optimize

def _eNegX_(p,x):
    x0,y0,c,k=p  
    y = (c * np.exp(-k*(x-x0))) + y0
    return y

def _eNegX_residuals(p,x,y):
    return y - _eNegX_(p,x)

def Get_eNegX_Coefficients(x,y):
    print('x is:  ', x)
    print('y is:  ', y)

    # Calculate p_guess for the vectors x,y.  Note that p_guess is the
    # starting estimate for the minimization.
    p_guess=(np.median(x),np.min(y),np.max(y),.01)

    # Calls the leastsq() function, which calls the residuals function with an initial 
    # guess for the parameters and with the x and y vectors.  Note that the residuals
    # function also calls the _eNegX_ function.  This will return the parameters p that
    # minimize the least squares error of the _eNegX_ function with respect to the original
    # x and y coordinate vectors that are sent to it.
    p, cov, infodict, mesg, ier = scipy.optimize.leastsq(
        _eNegX_residuals, p_guess, args=(x, y), full_output=True)

    # Define the optimal values for each element of p that were returned by the leastsq() function. 
    x0,y0,c,k=p  
    print('''Reference data:
    x0 = {x0}
    y0 = {y0}
    c = {c}
    k = {k}
    '''.format(x0=x0, y0=y0, c=c, k=k))

    print('x.min() is:  ', x.min())
    print('x.max() is:  ', x.max())
    # Create a numpy array of x-values
    numPoints = int(np.floor((x.max() - x.min()) * 100))
    xp = np.linspace(x.min(), x.max(), numPoints)
    print('numPoints is:  ', numPoints)
    print('xp is:  ', xp)
    print('p is:  ', p)
    pxp = _eNegX_(p, xp)
    print('pxp is:  ', pxp)

    # Plot the results  
    plt.plot(x, y, '>', xp, pxp, 'g-')
    plt.xlabel('BPM%Rest') 
    plt.ylabel('LVET/BPM',rotation='vertical')
    plt.xlim(0,3)
    plt.ylim(0,4)
    plt.grid(True) 
    plt.show()

    return p

# Declare raw data for use in creating regression equation 
x = np.array([1,1.425,1.736,2.178,2.518],dtype='float')  
y = np.array([3.489,2.256,1.640,1.043,0.853],dtype='float')  

p=Get_eNegX_Coefficients(x,y)

Answer


It looks like it's a problem with your initial guesses; something like (1, 1, 1, 1) works fine:
You have

p_guess=(np.median(x),np.min(y),np.max(y),.01)

and the function

def _eNegX_(p,x):
    x0,y0,c,k=p  
    y = (c * np.exp(-k*(x-x0))) + y0
    return y

So that's test_data_max * e^(-.01 * (x - test_data_median)) + test_data_min
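To see just how flat that guess is, here is a quick sketch (using the question's own data) that evaluates the model at p_guess across the data's x range; with k = .01 the exponential barely decays, so the "curve" covers only a tiny fraction of the data's y range:

```python
import numpy as np

x = np.array([1, 1.425, 1.736, 2.178, 2.518])
y = np.array([3.489, 2.256, 1.640, 1.043, 0.853])

# The question's p_guess: (median(x), min(y), max(y), .01)
x0, y0, c, k = np.median(x), np.min(y), np.max(y), 0.01
curve = c * np.exp(-k * (x - x0)) + y0

# The guessed curve spans only a sliver of the data's vertical range.
print(curve.max() - curve.min())   # ≈ 0.05
print(y.max() - y.min())           # ≈ 2.64
```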

I don't know much about the art of choosing good starting parameters, but I can say a few things. leastsq is finding a local minimum here - the key in choosing these values is to find the right mountain to climb, not to try to cut down on the work that the minimization algorithm has to do. Your initial guess looks like this (green): (1.736, 0.853, 3.489, 0.01)

which results in your flat line (blue): (-59.20295956, 1.8562, 1.03477144, 0.69483784)

Greater gains were made in adjusting the height of the line than in increasing the k value. If you know you're fitting to this kind of data, use a larger k. If you don't, you could try to find a decent k value by sampling your data, or by working back from the slope between the averages of the first and second halves of the data, but I wouldn't know how to go about that.
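One way such a half-averages heuristic might look (this is an assumption, not something spelled out in the answer: it takes y0 ≈ min(y), and since the model is exponential, it backs k out of the log-ratio of the two halves' mean heights above that floor):

```python
import numpy as np

def rough_k_guess(x, y):
    """Hypothetical heuristic: estimate the decay rate k from the drop
    between the first-half and second-half averages of the data."""
    order = np.argsort(x)
    x, y = x[order], y[order]
    mid = len(x) // 2
    y0 = y.min()                       # assume the floor is near min(y)
    x1, x2 = x[:mid].mean(), x[mid:].mean()
    y1, y2 = y[:mid].mean() - y0, y[mid:].mean() - y0
    if y1 <= 0 or y2 <= 0:
        return 1.0                     # fall back to a neutral guess
    # For y - y0 ~ c * exp(-k*x), the log of the half-to-half ratio
    # divided by the x separation recovers k.
    return np.log(y1 / y2) / (x2 - x1)

x = np.array([1, 1.425, 1.736, 2.178, 2.518])
y = np.array([3.489, 2.256, 1.640, 1.043, 0.853])
print(rough_k_guess(x, y))
```

On this data the heuristic lands around k ≈ 2, which at least puts the starting point on a decaying curve rather than a flat line.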

You could also start with several guesses, run the minimization several times, and take the line with the lowest residuals.
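A minimal sketch of that multi-start idea, redefining the question's model and residual functions so it runs on its own (the particular list of guesses is just an illustration):

```python
import numpy as np
import scipy.optimize

def model(p, x):
    x0, y0, c, k = p
    return c * np.exp(-k * (x - x0)) + y0

def residuals(p, x, y):
    return y - model(p, x)

x = np.array([1, 1.425, 1.736, 2.178, 2.518])
y = np.array([3.489, 2.256, 1.640, 1.043, 0.853])

# Run leastsq from several starting points and keep the fit with the
# lowest sum of squared residuals.
guesses = [
    (1, 1, 1, 1),
    (np.median(x), np.min(y), np.max(y), 0.01),
    (np.median(x), 0, np.max(y), 1),
]
best_p, best_err = None, np.inf
for g in guesses:
    p, ier = scipy.optimize.leastsq(residuals, g, args=(x, y))
    err = np.sum(residuals(p, x, y) ** 2)
    if ier in (1, 2, 3, 4) and err < best_err:   # ier 1-4 means success
        best_p, best_err = p, err

print(best_p, best_err)
```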
