Optimization algorithm (dog-leg trust-region) in Matlab and Python
Question
I'm trying to solve a set of nonlinear equations using the dog-leg trust-region algorithm in Matlab and Python.
In Matlab there is fsolve, where this algorithm is the default, whereas for Python we specify 'dogleg' in scipy.optimize.minimize. I don't need to specify a Jacobian or Hessian for the Matlab version, whereas Python needs one of the two to solve the problem.
I don't have the Jacobian/Hessian, so is there a way around this issue for Python? Or is there another function that performs the equivalent of Matlab's dog-leg fsolve?
Answer
In newer versions of scipy there is the approx_fprime function. It computes a numerical approximation of the Jacobian of a function f at position xk using forward-step finite differences. It returns an ndarray with the partial derivatives of f at xk.
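For illustration, a quick sketch of calling it directly (the function and evaluation point here are made up for the example):

import numpy as np
from scipy.optimize import approx_fprime

# Toy function: f(x) = x0**2 + sin(x1), whose analytic gradient is (2*x0, cos(x1)).
f = lambda x: x[0]**2 + np.sin(x[1])
xk = np.array([1.0, 0.0])
eps = np.sqrt(np.finfo(float).eps)  # a common choice of forward-difference step

grad = approx_fprime(xk, f, eps)
print(grad)  # approximately [2., 1.]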
If you can't upgrade your version of scipy, you can always copy the implementation from scipy's source.
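The forward-difference rule is also simple enough to hand-roll if needed; a minimal sketch of the same idea (not scipy's actual source):

import numpy as np

def my_approx_fprime(xk, f, epsilon):
    # One forward difference per coordinate: (f(xk + epsilon*e_i) - f(xk)) / epsilon.
    f0 = f(xk)
    grad = np.zeros_like(xk, dtype=float)
    for i in range(xk.size):
        step = np.zeros_like(xk, dtype=float)
        step[i] = epsilon
        grad[i] = (f(xk + step) - f0) / epsilon
    return grad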
scipy.optimize.minimize calls approx_fprime internally if the input jac=False. So in your case, it should be enough to do the following:
scipy.optimize.minimize(fun, x0, args, method='dogleg', jac=False)
Edit
scipy does not seem to handle the jac=False condition properly, so it is necessary to build a callable jac using approx_fprime, as follows:
import scipy.optimize
epsilon = 1.5e-8  # ~sqrt(machine epsilon), a common forward-difference step
jac = lambda x, *args: scipy.optimize.approx_fprime(x, fun, epsilon, *args)
scipy.optimize.minimize(fun, x0, args, method='dogleg', jac=jac)
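Putting it together, here is a self-contained sketch with a made-up objective standing in for the question's system of equations (which isn't shown). One caveat worth flagging: scipy's 'dogleg' method also requires a Hessian, so this sketch builds a numerical one by differentiating the numerical gradient; the larger step size for the Hessian is an ad-hoc choice to limit the noise from nested differencing.

import numpy as np
import scipy.optimize

# Made-up smooth objective; substitute your own sum-of-squared-residuals here.
fun = lambda x: (x[0] - 1.0)**2 + 10.0 * (x[1] + 2.0)**2

eps_jac = np.sqrt(np.finfo(float).eps)  # step for the gradient
eps_hess = 1e-5                         # larger step: nested differences are noisier

jac = lambda x: scipy.optimize.approx_fprime(x, fun, eps_jac)

def hess(x):
    # Each row is the forward-difference derivative of one gradient component;
    # symmetrize afterwards. dogleg also expects the result to be positive definite.
    n = x.size
    H = np.array([scipy.optimize.approx_fprime(x, lambda y, i=i: jac(y)[i], eps_hess)
                  for i in range(n)])
    return 0.5 * (H + H.T)

res = scipy.optimize.minimize(fun, np.zeros(2), method='dogleg', jac=jac, hess=hess)
print(res.x)  # should be close to [1., -2.]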