In Tensorflow, what is the difference between a Variable and a Tensor?


Question

The Tensorflow documentation states that a Variable can be used any place a Tensor can be used, and they seem to be fairly interchangeable. For example, if v is a Variable, then x = 1.0 + v becomes a Tensor.

What is the difference between the two, and when would I use one over the other?
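For concreteness, a minimal sketch of the behavior described above (written against the TF 1.x-style API that the question and answer assume; the exact type names printed vary by TensorFlow version):

    import tensorflow as tf

    v = tf.Variable(2.0, name="v")   # a Variable
    x = 1.0 + v                      # the "+" op produces an ordinary Tensor

    print(type(v))   # a Variable class
    print(type(x))   # a Tensor class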

Answer

It's true that a Variable can be used any place a Tensor can, but the key differences between the two are that a Variable maintains its state across multiple calls to run(), and a Variable's value can be updated by backpropagation (it can also be saved, restored, etc., as per the documentation).

These differences mean that you should think of a Variable as representing your model's trainable parameters (for example, the weights and biases of a neural network), while you can think of a Tensor as representing the data being fed into your model and the intermediate representations of that data as it passes through the model.
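As a rough illustration of the state-across-run() point (a minimal TF 1.x-style sketch, since the answer refers to run(); the names w and increment are just for this example, and in TF 2.x you would call w.assign_add(1.0) eagerly instead):

    import tensorflow as tf

    # Trainable parameter: a Variable whose value persists between run() calls.
    w = tf.Variable(0.0, name="w")

    # A Tensor describing a computation on the current value of w.
    increment = tf.assign_add(w, 1.0)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())  # Variables must be initialized
        for _ in range(3):
            # Prints 1.0, 2.0, 3.0: the Variable's state carries over between run() calls,
            # while the Tensor `increment` is just a recipe re-evaluated each time.
            print(sess.run(increment))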

