How to add learning rate to summaries?


Question

How do I monitor the learning rate of AdamOptimizer? TensorBoard: Visualizing Learning says that I need to

Collect these by attaching scalar_summary ops to the nodes that output the learning rate and loss respectively.

How do I do that?

Answer

I think something like the following inside the graph would work fine:

with tf.name_scope("learning_rate"):
    # global_step should not be trained on; it just counts optimizer steps
    global_step = tf.Variable(0, trainable=False)
    decay_steps = 1000  # set up your decay step
    decay_rate = .95    # set up your decay rate
    learning_rate = tf.train.exponential_decay(0.01, global_step, decay_steps,
                                               decay_rate, staircase=True,
                                               name="learning_rate")
tf.scalar_summary('learning_rate', learning_rate)
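For intuition, exponential_decay computes base_lr * decay_rate ** (global_step / decay_steps); with staircase=True the exponent is truncated to an integer, so the rate drops in discrete steps rather than decaying continuously. A minimal pure-Python sketch of that formula (the function here is illustrative, not TensorFlow's implementation):

```python
def exponential_decay(base_lr, global_step, decay_steps, decay_rate, staircase=False):
    # Documented formula: base_lr * decay_rate ** (global_step / decay_steps).
    exponent = global_step / decay_steps
    if staircase:
        # Integer division: the rate only changes every decay_steps steps.
        exponent = global_step // decay_steps
    return base_lr * decay_rate ** exponent

# With the values from the snippet above (0.01 base rate, decay every 1000 steps):
lr_start = exponential_decay(0.01, 0, 1000, 0.95, staircase=True)     # 0.01, no decay yet
lr_later = exponential_decay(0.01, 1500, 1000, 0.95, staircase=True)  # one decay: 0.01 * 0.95
```

Plotting this scalar in TensorBoard should show exactly these flat steps when staircase=True.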

(Of course, to make it work you'd also need to call tf.merge_all_summaries() and use a tf.train.SummaryWriter to write the summaries to the log in the end.)

