How can I print the Learning Rate at each epoch with Adam optimizer in Keras?

Problem description

Because online learning does not work well with Keras when you are using an adaptive optimizer (the learning rate schedule resets when calling .fit()), I want to see if I can just manually set it. However, in order to do that, I need to find out what the learning rate was at the last epoch.

That said, how can I print the learning rate at each epoch? I think I can do it through a callback but it seems that you have to recalculate it each time and I'm not sure how to do that with Adam.
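
For reference, a minimal sketch of manually reading and setting the base learning rate through the Keras backend (assuming an already-compiled model named model and the old Keras 2.x API):

from keras import backend as K

# Read the optimizer's current base learning rate variable.
old_lr = K.get_value(model.optimizer.lr)

# Overwrite it before the next .fit() call, e.g. when resuming online learning.
K.set_value(model.optimizer.lr, old_lr * 0.5)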

I found this in another thread but it only works with SGD:

from keras.callbacks import Callback
from keras import backend as K

class SGDLearningRateTracker(Callback):
    def on_epoch_end(self, epoch, logs={}):
        optimizer = self.model.optimizer
        # Reproduce SGD's time-based decay: lr / (1 + decay * iterations)
        lr = K.eval(optimizer.lr * (1. / (1. + optimizer.decay * optimizer.iterations)))
        print('\nLR: {:.6f}\n'.format(lr))

Recommended answer

from keras.callbacks import Callback
from keras import backend as K

class MyCallback(Callback):
    def on_epoch_end(self, epoch, logs=None):
        lr = self.model.optimizer.lr
        # If you want to apply the optimizer's time-based decay:
        decay = self.model.optimizer.decay
        iterations = self.model.optimizer.iterations
        lr_with_decay = lr / (1. + decay * K.cast(iterations, K.dtype(decay)))
        print(K.eval(lr_with_decay))
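
A usage sketch for the callback above (the model and data names are placeholders):

model.fit(x_train, y_train,
          epochs=10,
          callbacks=[MyCallback()])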

Follow this thread.
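
Note that the value printed above is only the base learning rate after time-based decay; Adam's actual update also applies bias correction and per-parameter scaling by its moment estimates. A minimal sketch of the bias-corrected step size, assuming the old Keras 2.x Adam implementation (the class name here is just illustrative, and the per-parameter second-moment scaling is still ignored):

from keras.callbacks import Callback
from keras import backend as K

class AdamStepSizeTracker(Callback):
    def on_epoch_end(self, epoch, logs=None):
        opt = self.model.optimizer
        lr = opt.lr
        if opt.initial_decay > 0:
            # Same time-based decay as in the answer above.
            lr = lr * (1. / (1. + opt.decay * K.cast(opt.iterations, K.dtype(opt.decay))))
        # Bias-corrected step size: lr * sqrt(1 - beta_2^t) / (1 - beta_1^t)
        t = K.cast(opt.iterations, K.floatx()) + 1
        lr_t = lr * (K.sqrt(1. - K.pow(opt.beta_2, t)) / (1. - K.pow(opt.beta_1, t)))
        print('\nAdam step size at epoch {}: {:.6f}'.format(epoch + 1, K.eval(lr_t)))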
