Pytorch Tensor storages have the same id when calling the storage() method

Question

I'm learning about tensor storage through a blog (in my native language - Viet), and after experimenting with the examples, I found something that was difficult to understand. Given 3 tensors x, zzz, and x_t as below:

import torch

x = torch.tensor([[3, 1, 2],
                  [4, 1, 7]])
zzz = torch.tensor([1,2,3])
# Transpose of the tensor x 
x_t = x.t()

When I assign the storage of each tensor to a corresponding variable, their ids are different from each other:

x_storage = x.storage()
x_t_storage = x_t.storage()
zzz_storage = zzz.storage()
print(id(x_storage), id(x_t_storage), id(zzz_storage)) 
print(x_storage.data_ptr())   
print(x_t_storage.data_ptr()) 

Output:

140372837772176 140372837682304 140372837768560
94914110126336
94914110126336

But when I call the storage() method on each original tensor in the same print statement, the same output is observed for all tensors, no matter how many times I try:

print(id(x.storage()), id(x_t.storage()), id(zzz.storage()))
# 140372837967904 140372837967904 140372837967904

The situation gets even weirder when I print them separately on different lines; sometimes the results are different and sometimes they are the same:

print(id(x.storage()))
print(id(x_t.storage()))
# Output: 
# 140372837771776
# 140372837709856

So my question is: why do the ids of the storages differ in the first case, while the same id is observed in the second (and where does that id come from)? And what is happening in the third case?

Also, I want to ask about the data_ptr() method, as it was suggested as a replacement for id in one question I saw on the Pytorch discussion forum (https://discuss.pytorch.org/t/any-way-to-check-if-two-tensors-have-the-same-base/44310), but the Pytorch docs give no further detail. I would be glad if anyone could give me detailed answers to any/all of these questions.

Answer

After searching on the Pytorch discussion forum and Stack Overflow, I see that the data_ptr() method should be used when comparing the locations of tensors (according to the Pytorch discussion linked in the question and this link), although it is not totally correct on its own; check the first Pytorch discussion for a better comparison method.
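
As a minimal sketch of that comparison (reusing the tensors from the question): data_ptr() returns the address of the first element of the underlying buffer, so a tensor and a view of it report the same pointer, while an unrelated tensor reports a different one. Comparing start addresses alone cannot detect storages that merely overlap, which is why the linked discussion suggests a more careful check.

import torch

x = torch.tensor([[3, 1, 2],
                  [4, 1, 7]])
x_t = x.t()                    # a view: shares x's underlying buffer
zzz = torch.tensor([1, 2, 3])  # an independent tensor with its own buffer

# Same buffer -> same starting address
print(x.storage().data_ptr() == x_t.storage().data_ptr())   # True

# Different buffer -> different starting address
print(x.storage().data_ptr() == zzz.storage().data_ptr())   # False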

About the id part, there have been many questions on this topic on Stack Overflow. I saw one question here with many answers that clear up most of the question above. I also had some misunderstandings about id and the memory allocation of objects, which were also addressed in the comment section of my recent question.
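
To make the id behavior concrete, here is a minimal sketch of the CPython address-reuse effect those answers describe; the reuse is an implementation detail of CPython's allocator, not guaranteed behavior. Each call to storage() builds a new Python wrapper object around the same underlying buffer, and id() reports the wrapper's address, not the buffer's. Continuing with the tensors from the question:

a = x.storage()
b = x_t.storage()
# Case 1: both wrappers are bound to names, so they are alive at the
# same time and must occupy distinct addresses; their ids differ.
print(id(a) == id(b))   # False

# Case 2: as a temporary, each wrapper is freed as soon as id() returns,
# so CPython may place the next wrapper at the address just freed.
print(id(x.storage()) == id(x_t.storage()))   # often True, not guaranteed

# Case 3: printing on separate lines allows other allocations to happen
# in between, so the address may or may not be reused.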
