Can't access TensorFlow Adam optimizer namespace

Problem description

I'm trying to learn about GANs and I'm working through the example here.

The code below using the Adam optimizer gives me the error:

"ValueError: Variable d_w1/Adam/ does not exist, or was not created with tf.get_variable(). Did you mean to set reuse=None in VarScope?"

I'm using TF 1.1.0.

d_loss_real = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=Dx, labels=tf.fill([batch_size, 1], 0.9)))
d_loss_fake = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(logits=Dg, labels=tf.zeros_like(Dg)))
d_loss = d_loss_real + d_loss_fake

tvars = tf.trainable_variables()

d_vars = [var for var in tvars if 'd_' in var.name]
g_vars = [var for var in tvars if 'g_' in var.name]

# Train the discriminator
# Increasing from 0.001 in GitHub version
with tf.variable_scope(tf.get_variable_scope(), reuse=False) as scope:

    # Next, we specify our two optimizers. In today’s era of deep learning, Adam seems to be the
    # best SGD optimizer as it utilizes adaptive learning rates and momentum. 
    # We call Adam's minimize function and also specify the variables that we want it to update.
    d_trainer_real = tf.train.AdamOptimizer(0.0001).minimize(d_loss_real, var_list=d_vars)
    d_trainer_fake = tf.train.AdamOptimizer(0.0001).minimize(d_loss_fake, var_list=d_vars)

I think the Adam optimizer is putting the variables into its own namespace, but for some reason they aren't initialized. I do call global_variables_initializer later in the code, as can be seen on the GitHub page. I've been checking through the documentation; I think it may be related to my having to put some kind of reuse_variables() call in there, but I'm not sure.

Any help is much appreciated.

Recommended answer

Your ValueError is caused by creating new variables inside a variable scope whose reuse flag is True.
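The error class is easy to reproduce in two lines (hypothetical scope and variable names; assumes TF 1.x, as in the question):

import tensorflow as tf

with tf.variable_scope('s', reuse=True):
    # Raises: ValueError: Variable s/new_var does not exist, or was not
    # created with tf.get_variable(). Did you mean to set reuse=None in VarScope?
    v = tf.get_variable('new_var', shape=[1])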

In your case, the offending variables are created by Adam when you call its minimize function; they store the momentum accumulators for each trainable variable in your graph.
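A minimal sketch of what minimize creates (TF 1.x API, as in the question; the variable name d_w1 mirrors the one in the error message):

import tensorflow as tf

w = tf.get_variable('d_w1', shape=[2, 2])
loss = tf.reduce_sum(tf.square(w))

# minimize() creates two slot variables per trainable variable (the first
# and second moment accumulators), named d_w1/Adam and d_w1/Adam_1, plus
# the beta1_power/beta2_power bookkeeping variables.
train_op = tf.train.AdamOptimizer(0.0001).minimize(loss)

print([v.name for v in tf.global_variables()])
# e.g. ['d_w1:0', 'beta1_power:0', 'beta2_power:0', 'd_w1/Adam:0', 'd_w1/Adam_1:0']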

Actually, the code "reuse=False" DOES NOT work as you might expect: once the reuse state is set to True it can never be changed back to False, and it is inherited by all of its sub-scopes.

# Assuming the enclosing (default) scope already has reuse == True,
# re-entering it with reuse=False does NOT reset the flag:
with tf.variable_scope(tf.get_variable_scope(), reuse=False) as scope:
    assert tf.get_variable_scope().reuse == True

I guess you set reuse to True somewhere before the posted code, so the default variable_scope.reuse == True. You then create a new variable_scope for Adam, but the new scope inherits the reuse state of the default scope. Adam therefore creates its variables under reuse == True, which raises the error.

The solution is to add a sub-scope under the graph's default scope and set variable_scope.reuse=True only there; the default scope's reuse then stays False, and Adam.minimize will work.
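A self-contained sketch of that fix (the one-layer discriminator and the 'model' scope name are hypothetical stand-ins for the tutorial's larger model):

import tensorflow as tf

def discriminator(x):
    # Hypothetical one-layer discriminator using the tutorial's d_ prefix.
    w = tf.get_variable('d_w1', shape=[4, 1],
                        initializer=tf.truncated_normal_initializer(stddev=0.02))
    return tf.matmul(x, w)

real = tf.placeholder(tf.float32, [None, 4])
fake = tf.placeholder(tf.float32, [None, 4])

# Confine reuse=True to a sub-scope instead of flipping it on the root scope.
with tf.variable_scope('model') as scope:
    Dx = discriminator(real)   # first call creates model/d_w1
    scope.reuse_variables()    # reuse applies only inside 'model'
    Dg = discriminator(fake)   # second call reuses model/d_w1

d_loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(
    logits=Dx, labels=tf.ones_like(Dx)))
d_vars = [v for v in tf.trainable_variables() if 'd_' in v.name]

# Back at the root scope, reuse is still False, so Adam can create its
# model/d_w1/Adam slot variables without raising the ValueError.
d_trainer = tf.train.AdamOptimizer(0.0001).minimize(d_loss, var_list=d_vars)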
