TensorBoard: How to plot histogram for gradients?


Question

TensorBoard has the ability to plot histograms of Tensors at session time. I want a histogram of the gradients during training.

tf.gradients(yvars, xvars) returns a list of gradients.

However, tf.histogram_summary('name', Tensor) accepts only Tensors, not lists of Tensors.
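
To make the mismatch concrete, here is a minimal sketch (old TF 0.x-era API; loss stands in for whatever is being minimized):

grads = tf.gradients(loss, tf.trainable_variables())  # a Python list with one Tensor per variable
# tf.histogram_summary expects a single Tensor, so the list cannot be passed as-is;
# each element would have to be summarized or combined first.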

For the time being, I made a workaround: I flatten all Tensors to a column vector and concatenate them:

# Flatten each gradient to a column vector and concatenate them into a single Tensor
for l in xrange(len(grads)):
    col_vec = tf.reshape(grads[l], [-1, 1])
    g = col_vec if l == 0 else tf.concat(0, [g, col_vec])
grad_hist = tf.histogram_summary("name", g)
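
A slightly tidier variant of the same workaround, assuming the same TF 0.x-era API (tf.concat with the axis first, tf.histogram_summary with a tag string), flattens every gradient once and concatenates them in a single call:

flat_grads = [tf.reshape(g, [-1]) for g in grads]  # flatten each gradient to 1-D
all_grads = tf.concat(0, flat_grads)               # one long vector of all gradient values
grad_hist = tf.histogram_summary("all_grads", all_grads)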

What would be a better way to plot a histogram of the gradients?

This seems like a common thing to do, so I hope TensorFlow has a dedicated function for it.

Answer

Another solution (based on this Quora answer) is to access the gradients directly from the optimizer you are already using.

optimizer = tf.train.AdamOptimizer(..)
# compute_gradients returns a list of (gradient, variable) pairs
grads = optimizer.compute_gradients(loss)
# one histogram per variable's gradient, merged into a single summary op
grad_summ_op = tf.summary.merge([tf.summary.histogram("%s-grad" % g[1].name, g[0]) for g in grads])
grad_vals = sess.run(fetches=grad_summ_op, feed_dict=feed_dict)
writer['train'].add_summary(grad_vals)
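
For context, here is a minimal sketch of how these pieces might sit in a full training loop. The apply_gradients call, the tf.summary.FileWriter, and the step bookkeeping are assumptions about the surrounding code (TF 1.x API), not part of the answer itself:

import tensorflow as tf

optimizer = tf.train.AdamOptimizer(learning_rate=1e-3)
grads = optimizer.compute_gradients(loss)            # list of (gradient, variable) pairs
train_op = optimizer.apply_gradients(grads)          # reuse the same gradients for the update step
grad_summ_op = tf.summary.merge(
    [tf.summary.histogram("%s-grad" % g[1].name, g[0]) for g in grads])

writer = tf.summary.FileWriter("logs/train")
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(num_steps):                    # loss, feed_dict, num_steps assumed to exist
        _, summ = sess.run([train_op, grad_summ_op], feed_dict=feed_dict)
        writer.add_summary(summ, global_step=step)   # the step makes the histograms a time series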

