How to display progress of scipy.optimize function?
Problem description
I use scipy.optimize to minimize a function of 12 arguments.
I started the optimization a while ago and am still waiting for results.
Is there a way to force scipy.optimize to display its progress (e.g. how much is already done, and what the current best point is)?
Recommended answer
As mg007 suggested, some of the scipy.optimize routines allow for a callback function (unfortunately leastsq does not permit this at the moment). Below is an example using the fmin_bfgs routine, where I use a callback function to display the current value of the arguments and the value of the objective function at each iteration.
import numpy as np
from scipy.optimize import fmin_bfgs

Nfeval = 1

def rosen(X):  # Rosenbrock function
    return (1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 + \
           (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2

def callbackF(Xi):
    global Nfeval
    print('{0:4d}   {1: 3.6f}   {2: 3.6f}   {3: 3.6f}   {4: 3.6f}'.format(Nfeval, Xi[0], Xi[1], Xi[2], rosen(Xi)))
    Nfeval += 1

print('{0:4s}   {1:9s}   {2:9s}   {3:9s}   {4:9s}'.format('Iter', ' X1', ' X2', ' X3', 'f(X)'))
x0 = np.array([1.1, 1.1, 1.1], dtype=np.double)
[xopt, fopt, gopt, Bopt, func_calls, grad_calls, warnflg] = \
    fmin_bfgs(rosen,
              x0,
              callback=callbackF,
              maxiter=2000,
              full_output=True,
              retall=False)
输出看起来像这样:
The output looks like this:
Iter    X1          X2          X3         f(X)
   1    1.031582    1.062553    1.130971    0.005550
   2    1.031100    1.063194    1.130732    0.004973
   3    1.027805    1.055917    1.114717    0.003927
   4    1.020343    1.040319    1.081299    0.002193
   5    1.005098    1.009236    1.016252    0.000739
   6    1.004867    1.009274    1.017836    0.000197
   7    1.001201    1.002372    1.004708    0.000007
   8    1.000124    1.000249    1.000483    0.000000
   9    0.999999    0.999999    0.999998    0.000000
  10    0.999997    0.999995    0.999989    0.000000
  11    0.999997    0.999995    0.999989    0.000000
Optimization terminated successfully.
Current function value: 0.000000
Iterations: 11
Function evaluations: 85
Gradient evaluations: 17
At least this way you can watch as the optimizer tracks the minimum.
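The same idea works with the unified scipy.optimize.minimize interface, which also accepts a callback argument called once per iteration. Below is a minimal sketch of that variant, reusing the same three-variable Rosenbrock-style function from the answer above; the mutable counter list is just one way to avoid the global variable.

import numpy as np
from scipy.optimize import minimize

def rosen(X):
    # Same 3-variable Rosenbrock-style function as in the answer above
    return ((1.0 - X[0])**2 + 100.0 * (X[1] - X[0]**2)**2 +
            (1.0 - X[1])**2 + 100.0 * (X[2] - X[1]**2)**2)

iteration = [0]  # mutable counter so the callback can update it without a global

def progress(Xi):
    # minimize calls this with the current parameter vector after each iteration
    iteration[0] += 1
    print('{0:4d}   {1:9.6f}   {2:9.6f}   {3:9.6f}   {4:9.6f}'.format(
        iteration[0], Xi[0], Xi[1], Xi[2], rosen(Xi)))

x0 = np.array([1.1, 1.1, 1.1])
res = minimize(rosen, x0, method='BFGS', callback=progress)
print(res.x)

The callback only reports progress; it cannot stop the solver for most methods, so maxiter (via the options dict) remains the way to bound the run.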