What does it mean to "get variable" in TensorFlow?
Question
I have seen the following statement many times:
tf.get_variable gets from the graph an existing variable
What exactly does it mean? If a variable already exists, why can't I reuse it (or "get" it) by calling its name? For example, I have a variable x, so I just use this variable everywhere I need it. Why should I use get_variable to "get" it (whatever that means)?
Answer
If a variable already exists, why can't I reuse it (or "get" it) by calling its name?
You can, and that is usually done when the whole model lives in one file. However, a big model is likely to be split across different source files and libraries. In that case tf.get_variable is simply convenient: "tf.get_variable also allows you to reuse a previously created variable of the same name, making it easy to define models which reuse layers".
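As a minimal sketch of that reuse (assuming the TF1-style graph API, reached via tf.compat.v1 under TensorFlow 2), two calls into the same variable scope with reuse enabled hand back the very same variable object, even when those calls come from different functions or files:

```python
import tensorflow as tf

# TF1-style graph API; under TensorFlow 2 it lives in tf.compat.v1.
tf1 = tf.compat.v1
tf1.disable_eager_execution()

def dense_layer(x):
    # get_variable creates "layer/w" on the first call and simply
    # returns the existing variable on every later call.
    with tf1.variable_scope("layer", reuse=tf1.AUTO_REUSE):
        w = tf1.get_variable("w", shape=[2, 2])
    return tf1.matmul(x, w)

x = tf1.placeholder(tf.float32, shape=[1, 2])
y1 = dense_layer(x)
y2 = dense_layer(x)  # no second "layer/w" is created here
```

Both calls to dense_layer operate on the single variable named "layer/w", which is exactly the sharing the quoted documentation describes.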
Out-of-the-box layers and functions in TensorFlow often define their variables with tf.get_variable, for example tf.contrib.crf.crf_log_likelihood (source code), which allows the client to pass in a transitions matrix even if the crf_log_likelihood invocation is in another module, or even in third-party code.
The possibility of sharing is another use case, as already suggested in the comments, so writing tf.get_variable deep within a layer is a step towards better compositionality.
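To see why a name-based lookup enables this compositionality, here is a toy pure-Python sketch (all names illustrative, not TensorFlow's actual implementation) of the get-or-create semantics behind a name-keyed variable store:

```python
# Toy name-keyed variable store mimicking get_variable's
# get-or-create behaviour. All names here are illustrative.
_variable_store = {}

def get_variable(name, initial_value=None):
    # Return the existing variable registered under this name,
    # or create and register a new one.
    if name in _variable_store:
        return _variable_store[name]
    if initial_value is None:
        raise ValueError(f"variable {name!r} does not exist "
                         "and no initial value was given")
    _variable_store[name] = initial_value
    return initial_value

# Two independent "modules" share state by agreeing on a name,
# without ever passing the Python object between them.
w = get_variable("layer/w", initial_value=[1.0, 2.0])
w_again = get_variable("layer/w")  # fetched, not re-created
```

Because callers only need to agree on a string name, a layer buried in a library and client code in another file can cooperate on the same variable, which is the point the answer makes.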