Is .data still useful in PyTorch?
Question
I'm new to PyTorch. I've read a lot of PyTorch code that makes heavy use of a tensor's .data member, but searching for .data in the official documentation and on Google turns up very little. I guess .data holds the data inside the tensor, but when do we need it, and when not?
Answer
.data was an attribute of Variable (an object representing a Tensor with history tracking, e.g. for automatic updates), not of Tensor itself. In practice, .data gave access to the Variable's underlying Tensor.
However, since PyTorch version 0.4.0, Variable and Tensor have been merged (into an updated Tensor structure), so .data disappeared along with the previous Variable object (well, Variable is still there for backward compatibility, but it is deprecated).
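A quick illustration of the merge, as I understand it: a plain Tensor now does the history tracking itself, so no wrapper is needed.

    import torch

    # Post-0.4.0: the Tensor itself tracks history; no Variable wrapper needed.
    x = torch.ones(2, 2, requires_grad=True)
    y = (x * 2).sum()
    y.backward()
    print(x.grad)   # tensor([[2., 2.], [2., 2.]]): d(sum(2x))/dx = 2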
A paragraph from the Release Notes for version 0.4.0 (I recommend reading the whole section about the Variable/Tensor updates):
What about .data?
.data was the primary way to get the underlying Tensor from a Variable. After this merge, calling y = x.data still has similar semantics. So y will be a Tensor that shares the same data with x, is unrelated with the computation history of x, and has requires_grad=False.
However, .data can be unsafe in some cases. Any changes on x.data wouldn't be tracked by autograd, and the computed gradients would be incorrect if x is needed in a backward pass. A safer alternative is to use x.detach(), which also returns a Tensor that shares data with requires_grad=False, but will have its in-place changes reported by autograd if x is needed in backward.
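To make the release-note behavior concrete, here is a minimal sketch (tensor names are just examples; the sigmoid case is the one commonly used to show the danger, because sigmoid's backward pass reuses its saved output):

    import torch

    # Shared-storage semantics of y = x.data:
    x = torch.ones(3, requires_grad=True)
    y = x.data
    print(y.requires_grad)               # False: y carries no history
    print(y.data_ptr() == x.data_ptr())  # True: same underlying storage

    # Why .data can be unsafe: mutating the saved output through .data
    # is invisible to autograd, so backward() silently computes a wrong gradient.
    a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    out = a.sigmoid()
    out.data.zero_()           # mutation behind autograd's back
    out.sum().backward()
    print(a.grad)              # tensor([0., 0., 0.]) -- wrong, but no error raised

    # The same mutation through .detach() is tracked, so autograd
    # raises instead of returning a silently wrong gradient.
    b = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
    out = b.sigmoid()
    out.detach().zero_()
    out.sum().backward()       # RuntimeError: ... modified by an inplace operation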