How to use several summary collections in Tensorflow?
Question
I have two distinct groups of summaries: one is collected once per batch, the other once per epoch. How can I use merge_all_summaries(key='???') to collect the summaries in these two groups separately? Doing it manually is always an option, but there seems to be a better way.
An illustration of how I think it should work:
# once per batch
tf.scalar_summary("loss", graph.loss)
tf.scalar_summary("batch_acc", batch_accuracy)
# once per epoch
gradients = tf.gradients(graph.loss, [W, D])
tf.histogram_summary("embedding/W", W, collections='per_epoch')
tf.histogram_summary("embedding/D", D, collections='per_epoch')
tf.merge_all_summaries() # -> (MergeSummary...) :)
tf.merge_all_summaries(key='per_epoch') # -> NONE :(
Answer
Problem solved: the collections parameter of a summary op is supposed to be a list. Solution:
# once per batch
tf.scalar_summary("loss", graph.loss)
tf.scalar_summary("batch_acc", batch_accuracy)
# once per epoch
tf.histogram_summary("embedding/W", W, collections=['per_epoch'])
tf.histogram_summary("embedding/D", D, collections=['per_epoch'])
tf.merge_all_summaries() # -> (MergeSummary...) :)
tf.merge_all_summaries(key='per_epoch') # -> (MergeSummary...) :)
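Why the bare string failed: in the TF versions of that era, the summary helpers looped directly over the collections argument, so a string was consumed character by character and the op was filed under one-letter collections such as 'p', 'e', 'r'. A minimal pure-Python sketch of that registry behavior (the registry dict and helper names here are illustrative, not the actual TF source):

```python
# Minimal sketch of a graph-collections registry, mimicking how the
# old summary ops filed an op under each requested collection name.
collections_registry = {}

def add_to_collection(name, value):
    collections_registry.setdefault(name, []).append(value)

def collect(value, collections):
    # The old summary helpers looped over `collections` directly,
    # so a bare string is iterated character by character.
    for key in collections:
        add_to_collection(key, value)

collect("histogram_W", collections="per_epoch")    # bug: bare string
collect("histogram_D", collections=["per_epoch"])  # fix: a list

# The string form scatters the op across one-letter collections
# ('p', 'e', 'r', ...), so merge_all(key='per_epoch') finds nothing;
# only the list form actually lands in the 'per_epoch' collection.
print(sorted(collections_registry))
print(collections_registry["per_epoch"])  # only 'histogram_D'
```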
Edit: the TF API syntax has since changed:
# once per batch
tf.summary.scalar("loss", graph.loss)
tf.summary.scalar("batch_acc", batch_accuracy)
# once per epoch
tf.summary.histogram("embedding/W", W, collections=['per_epoch'])
tf.summary.histogram("embedding/D", D, collections=['per_epoch'])
tf.summary.merge_all() # -> (MergeSummary...) :)
tf.summary.merge_all(key='per_epoch') # -> (MergeSummary...) :)
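To complete the picture, the two merged ops are then evaluated at different frequencies in the training loop. A minimal sketch, written against tf.compat.v1 so it builds on current TensorFlow releases; the toy graph, loop bounds, and writer path are illustrative assumptions, not part of the original question:

```python
import tensorflow as tf

tf1 = tf.compat.v1
tf1.disable_eager_execution()  # these are graph-mode APIs

with tf1.Graph().as_default():
    loss = tf1.placeholder(tf.float32, name="loss")
    W = tf1.get_variable("W", shape=[4], initializer=tf1.zeros_initializer())

    # Per-batch summaries go to the default collection...
    tf1.summary.scalar("loss", loss)
    # ...per-epoch summaries to a separate one (note: a list).
    tf1.summary.histogram("embedding/W", W, collections=["per_epoch"])

    per_batch = tf1.summary.merge_all()                 # default collection
    per_epoch = tf1.summary.merge_all(key="per_epoch")  # custom collection

    with tf1.Session() as sess:
        sess.run(tf1.global_variables_initializer())
        writer = tf1.summary.FileWriter("/tmp/summaries", sess.graph)
        step = 0
        for epoch in range(2):        # illustrative loop bounds
            for batch in range(3):
                s = sess.run(per_batch, feed_dict={loss: float(batch)})
                writer.add_summary(s, step)  # written every batch
                step += 1
            s = sess.run(per_epoch)          # written once per epoch
            writer.add_summary(s, step)
        writer.close()
```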