How to understand the term `tensor` in TensorFlow?


Question


I am new to TensorFlow. While I am reading the existing documentation, I found the term tensor really confusing. Because of it, I need to clarify the following questions:

  1. What is the relationship between tensor and Variable, tensor
    vs. tf.constant, 'tensor' vs. tf.placeholder?
  2. Are they all types of tensors?

Answer


TensorFlow doesn't have first-class Tensor objects, meaning that there is no notion of a Tensor in the underlying graph that's executed by the runtime. Instead, the graph consists of op nodes connected to each other, representing operations. An operation allocates memory for its outputs, which are available on endpoints :0, :1, etc., and you can think of each of these endpoints as a Tensor. If you have a tensor corresponding to nodename:0, you can fetch its value as sess.run(tensor) or sess.run('nodename:0'). Execution granularity happens at the operation level, so the run method will execute the op, which will compute all of the endpoints, not just the :0 endpoint. It's possible to have an Op node with no outputs (like tf.group), in which case there are no tensors associated with it. It is not possible to have tensors without an underlying Op node.
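As a sketch of the endpoint naming above, the same value can be fetched either through the Python Tensor object or through its `nodename:0` endpoint string, and a no-output op like `tf.group` really has no tensors attached. This sketch assumes TensorFlow 2.x with the `tf.compat.v1` compatibility layer; under the TF 1.x API the answer describes, a plain `import tensorflow as tf` would suffice.

```python
# Sketch only: assumes TF 2.x with the v1 compatibility layer.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

tf.reset_default_graph()
a = tf.constant(3, name="a")      # op node "a", output endpoint "a:0"
b = tf.constant(4, name="b")      # op node "b", output endpoint "b:0"
s = tf.add(a, b, name="s")        # op node "s", output endpoint "s:0"
noop = tf.group(a, b)             # an Op node with no output endpoints

sess = tf.Session()
by_object = sess.run(s)           # fetch via the Python Tensor object
by_name = sess.run("s:0")         # fetch via the endpoint name
print(by_object, by_name)         # both print 7
print(len(noop.outputs))          # 0 -- no tensors associated with tf.group
sess.close()
```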


You can examine what happens in the underlying graph by doing something like this:

import tensorflow as tf

tf.reset_default_graph()
value = tf.constant(1)
print(tf.get_default_graph().as_graph_def())


So with tf.constant you get a single operation node, and you can fetch it using sess.run("Const:0") or sess.run(value).
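A minimal sketch of both fetch styles for that constant (again assuming the `tf.compat.v1` layer under TF 2.x):

```python
# Sketch only: assumes TF 2.x with the v1 compatibility layer.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

tf.reset_default_graph()
value = tf.constant(1)            # default op name "Const", endpoint "Const:0"

sess = tf.Session()
by_name = sess.run("Const:0")     # fetch by endpoint name
by_object = sess.run(value)       # fetch by Tensor object
print(by_name, by_object)         # both print 1
sess.close()
```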


Similarly, value=tf.placeholder(tf.int32) creates a regular node with the name Placeholder, and you could feed it as feed_dict={"Placeholder:0":2} or feed_dict={value:2}. You cannot feed and fetch a placeholder in the same session.run call, but you can see the result by attaching a tf.identity node on top and fetching that.
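The identity trick can be sketched like this (assuming the `tf.compat.v1` layer under TF 2.x), showing both feeding styles:

```python
# Sketch only: assumes TF 2.x with the v1 compatibility layer.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

tf.reset_default_graph()
value = tf.placeholder(tf.int32)   # op node "Placeholder", endpoint "Placeholder:0"
readback = tf.identity(value)      # identity node on top, so the fed value is fetchable

sess = tf.Session()
fed_by_object = sess.run(readback, feed_dict={value: 2})            # feed via the Tensor object
fed_by_name = sess.run(readback, feed_dict={"Placeholder:0": 5})    # feed via the endpoint name
print(fed_by_object, fed_by_name)  # 2 5
sess.close()
```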

For variables

tf.reset_default_graph()
value = tf.Variable(tf.ones_initializer()(()))
value2 = value+3
print(tf.get_default_graph().as_graph_def())


You'll see that it creates two nodes, Variable and Variable/read; the :0 endpoint is a valid value to fetch on both of these nodes. However, Variable:0 has a special ref type, meaning it can be used as an input to mutating operations. The result of the Python call tf.Variable is a Python Variable object, and there's some Python magic that substitutes Variable/read:0 or Variable:0 depending on whether mutation is necessary. Since most ops have only one endpoint, :0 is dropped. Another example is Queue -- the close() method will create a new Close op node which connects to the Queue op. To summarize -- operations on Python objects like Variable and Queue map to different underlying TensorFlow op nodes depending on usage.
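A sketch of reading variables through a session (assuming the `tf.compat.v1` layer under TF 2.x; note that TF 2.x defaults to resource variables, so the exact Variable/Variable/read node pair described above may look different in the graph dump, but fetching works the same way):

```python
# Sketch only: assumes TF 2.x with the v1 compatibility layer.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

tf.reset_default_graph()
value = tf.Variable(tf.ones_initializer()(()))   # scalar variable initialized to 1.0
value2 = value + 3                               # reads through the variable's read endpoint

init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)              # variables must be initialized before they can be read
v = sess.run(value)
v2 = sess.run(value2)
print(v, v2)                # 1.0 4.0
sess.close()
```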


For ops like tf.split or tf.nn.top_k which create nodes with multiple endpoints, Python's session.run call automatically wraps the output in a tuple or collections.namedtuple of Tensor objects, which can be fetched individually.
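For example, tf.nn.top_k builds one op node with two endpoints, values and indices, which can be fetched separately (sketch assuming the `tf.compat.v1` layer under TF 2.x):

```python
# Sketch only: assumes TF 2.x with the v1 compatibility layer.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

tf.reset_default_graph()
x = tf.constant([1.0, 3.0, 2.0, 4.0])
top = tf.nn.top_k(x, k=2)   # one op node; namedtuple of two endpoints: values, indices

sess = tf.Session()
values, indices = sess.run([top.values, top.indices])   # each endpoint fetched individually
print(values)    # [4. 3.]
print(indices)   # [3 1]
sess.close()
```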
