In TensorFlow, what is tf.identity used for?


Question

I've seen tf.identity used in a few places, such as the official CIFAR-10 tutorial and the batch-normalization implementation on stackoverflow, but I don't see why it's necessary.

What's it used for? Can anyone give a use case or two?

One proposed answer is that it can be used for transfer between the CPU and GPU. This is not clear to me. Extension to the question, based on this: loss = tower_loss(scope) is under the GPU block, which suggests to me that all operators defined in tower_loss are mapped to the GPU. Then, at the end of tower_loss, we see total_loss = tf.identity(total_loss) before it's returned. Why? What would be the flaw with not using tf.identity here?

Answer

After some stumbling I think I've noticed a single use case that fits all the examples I've seen. If there are other use cases, please elaborate with an example.

Use case:

Suppose you'd like to run an operator every time a particular Variable is evaluated. For example, say you'd like to add one to x every time the variable y is evaluated. It might seem like this will work:

import tensorflow as tf  # TF 1.x graph-mode API

x = tf.Variable(0.0)
x_plus_1 = tf.assign_add(x, 1)  # op that increments x by one

with tf.control_dependencies([x_plus_1]):
    y = x  # no new op is created here; y is just another name for x
init = tf.global_variables_initializer()

with tf.Session() as session:
    init.run()
    for i in range(5):
        print(y.eval())

It doesn't: it prints 0.0, 0.0, 0.0, 0.0, 0.0. The assignment y = x merely gives the existing tensor x a second Python name; it adds no new node to the graph, so the control dependency has nothing to attach to. Instead, we need to create a new op inside the control_dependencies block. So we use this trick:

import tensorflow as tf  # TF 1.x graph-mode API

x = tf.Variable(0.0)
x_plus_1 = tf.assign_add(x, 1)

with tf.control_dependencies([x_plus_1]):
    y = tf.identity(x)  # a new op inside the block, so evaluating y first runs x_plus_1
init = tf.global_variables_initializer()

with tf.Session() as session:
    init.run()
    for i in range(5):
        print(y.eval())

This works: it prints 1.0, 2.0, 3.0, 4.0, 5.0.

If in the CIFAR-10 tutorial we dropped tf.identity, then loss_averages_op would never run.
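For reference, the pattern in question looks roughly like this. This is a paraphrased sketch of TF 1.x-era graph-mode code, not the exact tutorial source, and it assumes a total_loss tensor has already been built:

```python
# Sketch (TF 1.x API): attaching a moving-average update to the loss,
# in the style of the CIFAR-10 tutorial.
loss_averages = tf.train.ExponentialMovingAverage(0.9, name='avg')
loss_averages_op = loss_averages.apply([total_loss])

with tf.control_dependencies([loss_averages_op]):
    # tf.identity creates a new op inside the block, so anything that
    # evaluates total_loss also runs loss_averages_op. Returning
    # total_loss directly would skip the update entirely.
    total_loss = tf.identity(total_loss)
```

The design choice here is the same as in the toy example above: tf.identity is a cheap no-op whose only job is to give the control dependency a node to hang on.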

