Implementation of the Gauss-Newton method from Wikipedia example


Question

I'm relatively new to Python and am trying to implement the Gauss-Newton method, specifically the example on the Wikipedia page for it (Gauss–Newton algorithm, Example section). The following is what I have done so far:

import scipy
import numpy as np
import math
import scipy.misc

from matplotlib import pyplot as plt, cm, colors

S = [0.038,0.194,.425,.626,1.253,2.500,3.740]
rate = [0.050,0.127,0.094,0.2122,0.2729,0.2665,0.3317]
iterations = 5
rows = 7
cols = 2

B = np.matrix([[.9],[.2]]) # original guess for B

Jf = np.zeros((rows,cols)) # Jacobian matrix from r
r = np.zeros((rows,1)) #r equations


def model(Vmax, Km, Sval):
   return ((Vmax * Sval) / (Km + Sval))

def partialDerB1(B2,xi):
   return round(-(xi/(B2+xi)),10)

def partialDerB2(B1,B2,xi):
   return round(((B1*xi)/((B2+xi)*(B2+xi))),10)

def residual(x,y,B1,B2):
   return (y - ((B1*x)/(B2+x)))


for i in range(0,iterations):

   sumOfResid=0
   #calculate Jr and r for this iteration.
   for j in range(0,rows):
      r[j,0] = residual(S[j],rate[j],B[0],B[1])
      sumOfResid = sumOfResid + (r[j,0] * r[j,0])
      Jf[j,0] = partialDerB1(B[1],S[j])
      Jf[j,1] = partialDerB2(B[0],B[1],S[j])

   Jft =  np.transpose(Jf)
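   # note: np.dot(Jft, Jf)**-1 on the next line takes element-wise reciprocals, not a matrix inverse (see the answer below)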
   B = B + np.dot((np.dot(Jft,Jf)**-1),(np.dot(Jft,r)))

   print(B)

The sum of squared residuals increases at each iteration rather than tending towards 0, and the resulting B vector keeps growing.

I'm having trouble understanding where my problem is, and any help would be appreciated.

Answer

The error is in the code for the beta update: it should be

B = B - np.dot(np.dot(inv(np.dot(Jft, Jf)), Jft), r)

instead of applying **-1 to the matrix; on a NumPy array, **-1 computes element-wise reciprocals, not the matrix inverse, so use numpy.linalg.inv to invert np.dot(Jft, Jf). Note also the sign: because Jf holds the partial derivatives of the residuals r = y - f, the Gauss-Newton step is subtracted from B.
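As a quick illustration of the difference (a minimal sketch on a made-up 2x2 array, not taken from the question or the answer):

import numpy as np
from numpy.linalg import inv

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

print(A**-1)   # element-wise reciprocals: [[0.25, 0.5], [1.0, 0.333...]]
print(inv(A))  # matrix inverse:           [[0.3, -0.2], [-0.1, 0.4]]

With that fix in place, the full corrected script looks like this: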

import scipy
import numpy as np
from numpy.linalg import inv
import math
import scipy.misc

#from matplotlib import pyplot as plt, cm, colors

S = [0.038,0.194,.425,.626,1.253,2.500,3.740]
rate = [0.050,0.127,0.094,0.2122,0.2729,0.2665,0.3317]
iterations = 5
rows = 7
cols = 2

B = np.matrix([[.9],[.2]]) # original guess for B
print(B)

Jf = np.zeros((rows,cols)) # Jacobian matrix from r
r = np.zeros((rows,1)) #r equations


def model(Vmax, Km, Sval):
   return ((Vmax * Sval) / (Km + Sval))

def partialDerB1(B2,xi):
   return round(-(xi/(B2+xi)),10)

def partialDerB2(B1,B2,xi):
   return round(((B1*xi)/((B2+xi)*(B2+xi))),10)

def residual(x,y,B1,B2):
   return (y - ((B1*x)/(B2+x)))

for _ in range(iterations):

   sumOfResid=0
   #calculate Jr and r for this iteration.
   for j in range(rows):
      r[j,0] = residual(S[j],rate[j],B[0],B[1])
      sumOfResid += (r[j,0] * r[j,0])
      Jf[j,0] = partialDerB1(B[1],S[j])
      Jf[j,1] = partialDerB2(B[0],B[1],S[j])

   Jft =  Jf.T
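   # Gauss-Newton step: B <- B - (Jf^T Jf)^{-1} Jf^T r, with Jf the Jacobian of the residuals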
   B -= np.dot(np.dot( inv(np.dot(Jft,Jf)),Jft),r)

   print(B)
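
As an independent cross-check (an addition, not part of the original answer; the helper name michaelis_menten is only for this sketch), the same least-squares fit can be done with scipy.optimize.curve_fit, and its estimates of Vmax and Km should closely match the B produced by the Gauss-Newton loop above.

import numpy as np
from scipy.optimize import curve_fit

S = np.array([0.038, 0.194, 0.425, 0.626, 1.253, 2.500, 3.740])
rate = np.array([0.050, 0.127, 0.094, 0.2122, 0.2729, 0.2665, 0.3317])

def michaelis_menten(s, Vmax, Km):
   # same model as above: rate = Vmax*S / (Km + S)
   return Vmax * s / (Km + s)

# start from the same initial guess as B above
popt, pcov = curve_fit(michaelis_menten, S, rate, p0=[0.9, 0.2])
print(popt)  # should agree closely with the final B from the Gauss-Newton iterations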
