Normalization for optimization in python


Problem Description

During optimization, it is often helpful to normalize the input parameters so that they are on the same order of magnitude, which can greatly improve convergence. For example, if we want to minimize f(x) and a reasonable initial guess is x0 = [1e3, 1e-4], it might help to normalize x0[0] and x0[1] to roughly the same order of magnitude (often O(1)).
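One common way to do this by hand is to divide each variable by a characteristic scale and optimize in the rescaled coordinates. A minimal sketch, where the objective `f` and the scale values are hypothetical examples chosen to match the x0 = [1e3, 1e-4] scenario above (not part of the original question):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical badly scaled objective: its minimum sits near
# x = [1e3, 1e-4], so the curvature along the two axes differs
# by many orders of magnitude.
def f(x):
    return (x[0] / 1e3 - 1.0) ** 2 + (x[1] / 1e-4 - 1.0) ** 2

# Characteristic scales taken from the initial guess x0 = [1e3, 1e-4].
scales = np.array([1e3, 1e-4])

# Optimize in normalized coordinates z = x / scales, where every
# component is O(1), then map the result back to the original units.
def f_scaled(z):
    return f(z * scales)

result = minimize(f_scaled, [0.5, 2.0], method='L-BFGS-B')
x_opt = result.x * scales
```

The wrapper changes only the coordinates the optimizer sees; the underlying objective is untouched.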

My question is: I have been using scipy.optimize, specifically the L-BFGS-B algorithm. Do I need to normalize the parameters manually by writing a wrapper function, or does the algorithm already do this for me?

Thanks!

Answer

I wrote a quick little program to test your question.

To summarize: if the parameters are within a couple of orders of magnitude of each other, the algorithm handles it (it converges successfully and does not need significantly more function evaluations).

On the other hand, once the scale difference goes beyond a factor of about 10000, the algorithm starts to break down and error out.

Here is the program:

import scipy.optimize


def test_lbfgsb():
    def surface(x):
        # A simple quadratic whose second variable is multiplied by
        # `factor`, so large factors make the problem badly conditioned.
        return (x[0] - 3.0) ** 2 + (factor * x[1] - 4.0) ** 2

    factor = None
    for exponent in range(0, 9):
        magnitude = 10 ** exponent
        factors = [x * magnitude for x in [1, 3]]
        for factor in factors:
            optimization_result = scipy.optimize.minimize(
                surface, [0, 0], method='l-bfgs-b')
            desc = 'at factor %d' % factor
            if not optimization_result.success:
                print('%s FAILURE (%s)' % (desc, optimization_result.message))
            else:
                print('%s, found min at %s, after %d evaluations' % (
                    desc, optimization_result.x, optimization_result.nfev))


test_lbfgsb()

Here is its output:

at factor 1, found min at [ 3.00000048  4.00000013], after 12 evaluations
at factor 3, found min at [ 2.99999958  1.33333351], after 36 evaluations
at factor 10, found min at [ 3.00000059  0.39999999], after 28 evaluations
at factor 30, found min at [ 2.99999994  0.13333333], after 36 evaluations
at factor 100, found min at [ 3.00000013  0.03999999], after 40 evaluations
at factor 300, found min at [ 3.          0.01333333], after 52 evaluations
at factor 1000, found min at [ 3.          0.00399999], after 64 evaluations
at factor 3000, found min at [  3.00000006e+00   1.33332833e-03], after 72 evaluations
at factor 10000, found min at [  3.00002680e+00   3.99998309e-04], after 92 evaluations
at factor 30000, found min at [  3.00000002e+00   1.33328333e-04], after 104 evaluations
at factor 100000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 300000, found min at [  3.00013621e+00   1.33292531e-05], after 196 evaluations
at factor 1000000, found min at [  3.00000348e-12   3.99500004e-06], after 60 evaluations
at factor 3000000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 10000000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 30000000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 100000000 FAILURE (ABNORMAL_TERMINATION_IN_LNSRCH)
at factor 300000000, found min at [  3.33333330e-17   8.33333350e-09], after 72 evaluations
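For the large factors that fail above, rescaling the badly scaled variable by hand restores convergence. A minimal sketch applying this to the same `surface` function; the `surface_scaled` wrapper and the particular `factor` value are my own additions for illustration, not part of the original answer:

```python
import scipy.optimize

factor = 1e8  # a scale where the unnormalized runs above broke down

def surface(x):
    return (x[0] - 3.0) ** 2 + (factor * x[1] - 4.0) ** 2

# Optimize over y = [x[0], factor * x[1]] so both variables are O(1),
# then undo the substitution to recover x.
def surface_scaled(y):
    return surface([y[0], y[1] / factor])

result = scipy.optimize.minimize(surface_scaled, [0.0, 0.0],
                                 method='L-BFGS-B')
x_opt = [result.x[0], result.x[1] / factor]
```

In the rescaled coordinates the problem is a well-conditioned quadratic with its minimum at y = [3, 4], so L-BFGS-B converges without trouble.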

