Difference between Variable and get_variable in TensorFlow
Question
As far as I know, Variable is the default operation for creating a variable, while get_variable is mainly used for weight sharing.
On the one hand, some people suggest using get_variable instead of the primitive Variable operation whenever you need a variable. On the other hand, I rarely see get_variable used in TensorFlow's official documentation and demos.
Thus I would like to know some rules of thumb on how to correctly use these two mechanisms. Are there any "standard" principles?
Answer
I'd recommend always using tf.get_variable(...) -- it will make it much easier to refactor your code if you ever need to share variables, e.g. in a multi-GPU setting (see the multi-GPU CIFAR example). There is no downside to it.
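To illustrate the sharing behavior the answer refers to, here is a minimal sketch using the TF1-style API (available as tf.compat.v1 in TensorFlow 2); the scope name "model" and the helper function are illustrative, not from the original answer:

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()  # use graph mode, as in TF1

def weight(units):
    # get_variable returns the existing variable when the enclosing
    # scope has reuse enabled, instead of creating a new one.
    return tf.get_variable("w", shape=[4, units])

with tf.variable_scope("model"):
    w1 = weight(2)              # creates model/w
with tf.variable_scope("model", reuse=True):
    w2 = weight(2)              # returns the same model/w

print(w1 is w2)                                  # True
print([v.name for v in tf.global_variables()])   # ['model/w:0']
```

This is exactly the pattern multi-GPU code relies on: each tower calls the same model-building function under a reusing scope, so all towers share one set of weights.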
Pure tf.Variable is lower-level; at some point tf.get_variable() did not exist, so some code still uses the low-level way.
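By contrast, tf.Variable always creates a fresh variable; if the requested name clashes with an existing one, the graph silently uniquifies it, so there is no sharing by name. A small sketch (assuming the TF1-style API via tf.compat.v1):

```python
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()  # graph-mode name uniquification

# Two tf.Variable calls with the same requested name produce two
# distinct variables; the second is silently renamed to w_1.
a = tf.Variable(tf.zeros([2]), name="w")
b = tf.Variable(tf.zeros([2]), name="w")
print(a.name, b.name)  # w:0 w_1:0
```

This silent renaming is why refactoring code built on raw tf.Variable into a shared-weights setting is harder than with tf.get_variable.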