Getting the current learning rate from a tf.train.AdamOptimizer


Problem description

I'd like to print out the learning rate for each training step of my neural network.

I know that Adam has an adaptive learning rate, but is there a way I can see this (for visualization in TensorBoard)?

Recommended answer

All the optimizers have a private variable that holds the value of the learning rate.

In AdagradOptimizer and GradientDescentOptimizer it is called self._learning_rate. In AdamOptimizer it is self._lr.

So you just need to print sess.run(optimizer._lr) to get this value. sess.run is needed because it is a tensor.
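A minimal sketch of this approach, assuming TensorFlow 1.x (tf.train.AdamOptimizer is not available in TF 2.x without tf.compat.v1). The toy model and variable names are illustrative, not part of the original answer; note also the fallback to the internal tensor optimizer._lr_t, since optimizer._lr may be a plain Python float:

    import tensorflow as tf

    # Toy model: fit y = 2x with a single weight.
    x = tf.placeholder(tf.float32, shape=[None])
    y = tf.placeholder(tf.float32, shape=[None])
    w = tf.Variable(1.0)
    loss = tf.reduce_mean(tf.square(w * x - y))

    optimizer = tf.train.AdamOptimizer(learning_rate=0.001)
    train_op = optimizer.minimize(loss)

    # Caveat: if learning_rate was passed as a plain Python float,
    # optimizer._lr is that float and cannot be fetched with sess.run;
    # the tensor built from it is optimizer._lr_t (created when minimize()
    # constructs the graph). Both are undocumented internals and may
    # change between TF versions.
    lr_tensor = optimizer._lr_t if isinstance(optimizer._lr, float) else optimizer._lr

    # For TensorBoard, the same tensor can be logged as a scalar summary.
    tf.summary.scalar('learning_rate', lr_tensor)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for step in range(3):
            _, lr = sess.run([train_op, lr_tensor],
                             feed_dict={x: [1.0, 2.0], y: [2.0, 4.0]})
            print('step %d, base learning rate: %g' % (step, lr))

Keep in mind that this fetches Adam's base learning rate hyperparameter; the effective per-parameter step size additionally depends on the running moment estimates and is not stored in a single tensor.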

