How do I use TensorBoard with tf.layers?


Question

As the weights are not explicitly defined, how can I pass them to a summary writer?

For example:

conv1 = tf.layers.conv2d(
    tf.reshape(X,[FLAGS.batch,3,160,320]),
    filters = 16,
    kernel_size = (8,8),
    strides=(4, 4),
    padding='same',
    kernel_initializer=tf.contrib.layers.xavier_initializer(),
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    name = 'conv1',
    activation = tf.nn.elu
    )

=>

summarize_tensor(
    ??????
)

Thanks!

Answer

That depends on what you want to record in TensorBoard. If you want to put every variable into TensorBoard, calling tf.all_variables() or tf.trainable_variables() will give you all the variables. Note that tf.layers.conv2d is just a wrapper that creates a Conv2D instance and calls its apply method. You can unwrap it like this:

conv1_layer = tf.layers.Conv2D(
    filters = 16,
    kernel_size = (8,8),
    strides=(4, 4),
    padding='same',
    kernel_initializer=tf.contrib.layers.xavier_initializer(),
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    name = 'conv1',
    activation = tf.nn.elu
)

conv1 = conv1_layer.apply(tf.reshape(X,[FLAGS.batch,3,160,320]))

Then you can use conv1_layer.kernel to access the kernel weights.
