Is .data still useful in pytorch?

Question

I'm new to PyTorch. I've read a lot of PyTorch code that makes heavy use of a tensor's .data member, but searching for .data in the official documentation and on Google turns up very little. I guess .data holds the data inside the tensor, but I don't know when we need it and when we don't.

Answer

.data was an attribute of Variable (an object wrapping a Tensor with history tracking, e.g. for automatic differentiation), not of Tensor itself. In practice, .data gave access to the Variable's underlying Tensor.

However, since PyTorch version 0.4.0, Variable and Tensor have been merged (into an updated Tensor structure), so .data disappeared along with the previous Variable object (well, Variable is still there for backward compatibility, but it is deprecated).
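
To make the merge concrete, here is a minimal sketch (my own illustration, not part of the original answer); the pre-0.4.0 Variable style is shown only in a comment, since it is deprecated:

```python
import torch

# Pre-0.4.0 style (deprecated): autograd required wrapping Tensors in Variables
# from torch.autograd import Variable
# x = Variable(torch.ones(2, 2), requires_grad=True)

# Since 0.4.0, Tensor carries the autograd machinery itself:
x = torch.ones(2, 2, requires_grad=True)
y = (x * 2).sum()
y.backward()   # gradients flow directly through the Tensor
print(x.grad)  # tensor([[2., 2.], [2., 2.]])
```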

Paragraph from the Release Notes for version 0.4.0 (I recommend reading the whole section about Variable/Tensor updates):

What about .data?

.data was the primary way to get the underlying Tensor from a Variable. After this merge, calling y = x.data still has similar semantics. So y will be a Tensor that shares the same data with x, is unrelated with the computation history of x, and has requires_grad=False.

However, .data can be unsafe in some cases. Any changes on x.data wouldn't be tracked by autograd, and the computed gradients would be incorrect if x is needed in a backward pass. A safer alternative is to use x.detach(), which also returns a Tensor that shares data with requires_grad=False, but will have its in-place changes reported by autograd if x is needed in backward.
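
As a minimal sketch of that difference (my own example, not from the release notes): both x.data and x.detach() return a Tensor that shares x's storage and has requires_grad=False, but only the .detach() version lets autograd catch an unsafe in-place change:

```python
import torch

# Case 1: mutating through .data -- autograd does not notice
x = torch.ones(3, requires_grad=True)
out = (x * x).sum()   # backward() needs the original values of x
x.data.zero_()        # shares x's storage, invisible to autograd
out.backward()
print(x.grad)         # tensor([0., 0., 0.]) -- silently wrong (should be all 2s)

# Case 2: mutating through .detach() -- autograd catches the problem
x = torch.ones(3, requires_grad=True)
out = (x * x).sum()
x.detach().zero_()    # also shares storage, but the change is version-tracked
out.backward()        # raises RuntimeError: a variable needed for gradient
                      # computation has been modified by an inplace operation
```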
