Optimization algorithm (dog-leg trust-region) in Matlab and Python


Problem description


I'm trying to solve a set of nonlinear equations using the dog-leg trust-region algorithm in Matlab and Python.

In Matlab there is fsolve, where this algorithm is the default, whereas for Python we specify 'dogleg' in scipy.optimize.minimize. I don't need to specify a Jacobian or Hessian for Matlab, whereas Python needs one or the other to solve the problem.


I don't have the Jacobian/Hessian so is there a way around this issue for Python? Or is there another function that performs the equivalent of Matlab's dog-leg method in fsolve?

Recommended answer

In newer versions of scipy there is the approx_fprime function. It computes a numerical approximation of the Jacobian of a function f at position xk using a forward-step finite difference, and returns an ndarray with the partial derivatives of f at xk.
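For example, a minimal sketch (the quadratic objective below is illustrative, not from the question):

```python
import numpy as np
from scipy.optimize import approx_fprime

# Illustrative objective: f(x) = x0^2 + x1^2, whose gradient at (1, 2) is (2, 4)
def fun(x):
    return x[0]**2 + x[1]**2

epsilon = np.sqrt(np.finfo(float).eps)  # common forward-difference step size
grad = approx_fprime(np.array([1.0, 2.0]), fun, epsilon)
print(grad)  # approximately [2. 4.]
```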


If you can't upgrade your version of scipy, you can always copy the implementation from scipy's source.


scipy.optimize.minimize calls approx_fprime internally if jac=False is passed. So in your case, it should be enough to do the following:

scipy.optimize.minimize(fun, x0, args, method='dogleg', jac=False)


Edit

scipy does not seem to handle the jac=False condition properly, so it is necessary to build a callable jac using approx_fprime, as follows:

# epsilon is the finite-difference step size, e.g. numpy.sqrt(numpy.finfo(float).eps)
jac = lambda x, *args: scipy.optimize.approx_fprime(x, fun, epsilon, *args)
scipy.optimize.minimize(fun, x0, args, method='dogleg', jac=jac)
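Note that scipy's documentation states that method='dogleg' also requires the Hessian, so in practice one may need to build that numerically as well by differentiating the approximate gradient. A runnable sketch under that assumption (the objective fun below is an illustrative convex example, not the asker's system; the step sizes are conventional choices):

```python
import numpy as np
from scipy.optimize import approx_fprime, minimize

eps_g = np.sqrt(np.finfo(float).eps)       # step for the gradient
eps_h = np.finfo(float).eps ** (1.0 / 3.0)  # larger step for the Hessian, to limit noise

# Illustrative objective with minimum at (1, -2)
def fun(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def jac(x):
    # Numerical gradient via forward differences
    return approx_fprime(x, fun, eps_g)

def hess(x):
    # Numerical Hessian: differentiate each gradient component, then symmetrize
    H = np.array([approx_fprime(x, lambda y, i=i: jac(y)[i], eps_h)
                  for i in range(len(x))])
    return 0.5 * (H + H.T)

res = minimize(fun, np.array([0.0, 0.0]), method='dogleg', jac=jac, hess=hess)
print(res.x)  # close to [1, -2]
```

The nested finite differences are only accurate for reasonably smooth functions; for noisy objectives a quasi-Newton method such as method='BFGS' (which needs no Hessian) may be more robust.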

