Is it possible to make a trainable variable not trainable?


Problem description

I created a trainable variable in a scope. Later, I entered the same scope, set it to reuse_variables, and used get_variable to retrieve the same variable. However, I cannot set the variable's trainable property to False. My get_variable line looks like this:

weight_var = tf.get_variable('weights', trainable=False)

But the variable 'weights' still appears in the output of tf.trainable_variables.

Can I set a shared variable's trainable flag to False by using get_variable?

The reason I want to do this is that I'm trying to reuse the low-level filters pre-trained on VGG net in my model. I want to build the graph as before, retrieve the weights variable, assign the VGG filter values to it, and then keep it fixed during the subsequent training steps.

Answer

After looking at the documentation and the code, I was not able to find a way to remove a variable from TRAINABLE_VARIABLES.

  • The first time tf.get_variable('weights', trainable=True) is called, the variable is added to the list of TRAINABLE_VARIABLES.
  • The second time you call tf.get_variable('weights', trainable=False), you get the same variable, but the argument trainable=False has no effect because the variable is already present in the list of TRAINABLE_VARIABLES (and there is no way to remove it from there).
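A minimal sketch of this behavior, written against the TF 1.x API through the compat module (the scope name 'conv1' and the shape are illustrative, not from the original post):

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

with tf.variable_scope('conv1'):
    # First call: the variable is created and added to TRAINABLE_VARIABLES.
    w = tf.get_variable('weights', shape=[3, 3], trainable=True)

with tf.variable_scope('conv1', reuse=True):
    # Second call: the same variable object is returned;
    # trainable=False is silently ignored for an existing variable.
    w2 = tf.get_variable('weights', trainable=False)

trainable_names = [v.name for v in tf.trainable_variables()]
# 'conv1/weights:0' is still in trainable_names, and w is w2.
```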

When calling the optimizer's minimize method (see the doc), you can pass var_list=[...] as an argument with the variables you want to optimize.

For instance, if you want to freeze all the layers of VGG except the last two, you can pass the weights of the last two layers in var_list.
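A small sketch of freezing via var_list, again using the compat module; the two variables and the toy loss stand in for the frozen VGG layers and the layers being fine-tuned:

```python
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

frozen = tf.get_variable('vgg_frozen', shape=[2], initializer=tf.ones_initializer())
tuned = tf.get_variable('vgg_tuned', shape=[2], initializer=tf.ones_initializer())
loss = tf.reduce_sum(tf.square(frozen) + tf.square(tuned))

# Only 'tuned' is passed in var_list, so the optimizer never updates
# 'frozen', even though both are still in TRAINABLE_VARIABLES.
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss, var_list=[tuned])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op)
    frozen_val, tuned_val = sess.run([frozen, tuned])
# frozen_val stays at 1.0; tuned_val moved by one gradient step.
```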

You can use a tf.train.Saver() to save variables and restore them later (see this tutorial).

  • First, you train your entire VGG model with all variables trainable. You save them to a checkpoint file by calling saver.save(sess, "/path/to/dir/model.ckpt").
  • Then, in another file, you train the second version with non-trainable variables, loading the variables previously stored with saver.restore(sess, "/path/to/dir/model.ckpt").
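The two phases above can be sketched as follows; here both phases sit in one script with two graphs for brevity, a temp directory stands in for "/path/to/dir", and the single 'weights' variable stands in for the VGG weights:

```python
import os
import tempfile
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

ckpt_path = os.path.join(tempfile.mkdtemp(), 'model.ckpt')

# Phase 1: the variable is trainable; (train, then) save a checkpoint.
g1 = tf.Graph()
with g1.as_default():
    w = tf.get_variable('weights', initializer=tf.constant([1.0, 2.0]))
    saver = tf.train.Saver()
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        saver.save(sess, ckpt_path)

# Phase 2 (normally a separate file): rebuild the same variable with
# trainable=False and restore the saved values into it.
g2 = tf.Graph()
with g2.as_default():
    w_frozen = tf.get_variable('weights', shape=[2], trainable=False)
    assert tf.trainable_variables() == []  # nothing left for an optimizer
    saver = tf.train.Saver()
    with tf.Session() as sess:
        saver.restore(sess, ckpt_path)
        restored = sess.run(w_frozen)
```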

Optionally, you can decide to save only some of the variables in your checkpoint file. See the doc for more info.
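A partial save is done by passing var_list to the Saver constructor; a short sketch with illustrative variable names:

```python
import os
import tempfile
import tensorflow.compat.v1 as tf
tf.disable_v2_behavior()

conv_w = tf.get_variable('conv/weights', shape=[2])
fc_w = tf.get_variable('fc/weights', shape=[2])  # deliberately not saved

# A Saver restricted to a subset: only 'conv/weights' goes into the checkpoint.
partial_saver = tf.train.Saver(var_list=[conv_w])

ckpt_path = os.path.join(tempfile.mkdtemp(), 'partial.ckpt')
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    partial_saver.save(sess, ckpt_path)

# Inspect the checkpoint: 'fc/weights' is absent.
saved_names = [name for name, shape in tf.train.list_variables(ckpt_path)]
```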

