How to get reference by name of variable/placeholder?
Question
By "name" I mean the name given in calls such as:

tf.placeholder(tf.float32, name='NAME')
tf.get_variable("W", [n_in, n_out], initializer=w_init())
I have several placeholders that I want to access from outside functions without passing references around. Assuming that placeholders with the given names exist, how can I get a reference to them? (This is all during graph construction, not at runtime.)
My second question is: how can I get all variables that have a given name, no matter their scope?
Example: all my weights are named "W" under many different scopes, and I want to collect them all into a list without adding each one manually. The same goes for the biases, say, when I want to plot a histogram.
Answer
First of all, you can get the placeholder using tf.Graph.get_tensor_by_name(). For example, assuming that you are working with the default graph:
placeholder1 = tf.placeholder(tf.float32, name='NAME')

# 'NAME:0' refers to the first (index-0) output of the op named 'NAME'.
placeholder2 = tf.get_default_graph().get_tensor_by_name('NAME:0')

assert placeholder1 == placeholder2
Secondly, I would use the following function to get all variables with a given name (no matter their scope):
def get_all_variables_with_name(var_name):
    # Variable names end in ':0' (the op's first output), so match on that.
    # Note: tf.all_variables() was deprecated in later TF 1.x releases in
    # favor of tf.global_variables().
    name = var_name + ':0'
    return [var for var in tf.all_variables() if var.name.endswith(name)]
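To see why matching on the ':0' suffix works across scopes: TF 1.x variable names have the form 'scope/.../var_name:output_index', e.g. 'layer1/W:0'. Here is a minimal string-only sketch of that matching logic (no TensorFlow needed to run it; the scope names are made up for illustration). Note that endswith can over-match a name like 'VW:0', so a stricter check compares the final path component exactly:

names = ['layer1/W:0', 'layer1/b:0', 'layer2/W:0', 'outer/inner/W:0', 'layer3/VW:0']
target = 'W' + ':0'

# The suffix match used by get_all_variables_with_name: scope-independent,
# but 'layer3/VW:0' slips through because it also ends with 'W:0'.
loose = [n for n in names if n.endswith(target)]

# Stricter alternative: compare the last path component exactly.
strict = [n for n in names if n.split('/')[-1] == target]

print(loose)   # includes 'layer3/VW:0'
print(strict)  # only the variables actually named 'W'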