TensorFlow reuse variable with tf.layers.conv2d


Question


I am trying to make 2 conv layers share the same weights, however, it seems the API does not work.

import tensorflow as tf

x = tf.random_normal(shape=[10, 32, 32, 3])

with tf.variable_scope('foo') as scope:
    conv1 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
    print(conv1.name)

    conv2 = tf.contrib.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, scope=scope)
    print(conv2.name)

This prints

foo/foo/Relu:0
foo/foo_1/Relu:0


Changing from tf.contrib.layers.conv2d to tf.layers.conv2d does not solve the problem.

The same issue occurs with tf.layers.conv2d:

import tensorflow as tf

x = tf.random_normal(shape=[10, 32, 32, 3])

conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')
print(conv1.name)
conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')
print(conv2.name)

which gives

conv/BiasAdd:0
conv_2/BiasAdd:0

Answer


In the code you wrote, the variables do get reused between the two convolution layers. Try this:

import tensorflow as tf

x = tf.random_normal(shape=[10, 32, 32, 3])

conv1 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=None, name='conv')

conv2 = tf.layers.conv2d(x, 3, [2, 2], padding='SAME', reuse=True, name='conv')

print([x.name for x in tf.global_variables()])

# prints
# [u'conv/kernel:0', u'conv/bias:0']


Note that only one weight tensor and one bias tensor have been created. Even though the two layers share their weights, they do not share the actual computation, which is why you see two different names for the operations: `conv/BiasAdd:0` and `conv_2/BiasAdd:0` are op names, not variable names.
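The distinction between shared variables and separate ops can be sketched with plain NumPy (a hypothetical naive convolution for illustration, not TensorFlow's implementation): two "layer" calls that reference the same kernel array produce identical outputs, even though each call performs its own computation.

```python
import numpy as np

rng = np.random.default_rng(0)
kernel = rng.normal(size=(2, 2))  # the single, shared 2x2 kernel


def conv2d_valid(img, k):
    # Naive 2D "valid" cross-correlation, written out explicitly.
    h, w = img.shape
    kh, kw = k.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out


img = rng.normal(size=(5, 5))
out1 = conv2d_valid(img, kernel)  # "layer 1": its own computation
out2 = conv2d_valid(img, kernel)  # "layer 2": same weights, separate computation
assert np.allclose(out1, out2)   # shared weights give identical results
```

This mirrors what happens above: `tf.global_variables()` shows one kernel and one bias (the shared `kernel` here), while each `tf.layers.conv2d` call still adds its own ops to the graph (the two `conv2d_valid` calls here).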
