Torch - narrow() without memory copy


Problem description

Is there any way of using :narrow in place and avoiding having to make a copy? I.e. :resize is the in-place version of :reshape; is there an equivalent for :narrow?

Answer

As stated in the docs, narrow does not perform a memory copy:

For the methods narrow, select and sub, the returned tensor shares the same Storage as the original. Hence, any modification in the memory of the sub-tensor will have an impact on the primary tensor, and vice-versa. These methods are very fast, as they do not involve any memory copy.

Example:

th> x = torch.Tensor{{1, 2}, {3, 4}}
th> y = x:narrow(1, 2, 1)
th> print(x:storage():data())
cdata<double *>: 0x0079f240
th> print(y:storage():data())
cdata<double *>: 0x0079f240

They only return a new tensor, i.e. a new object that uses the same storage behind the scenes.
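For readers on modern PyTorch (the successor to Lua Torch), narrow(dim, start, length) keeps the same view semantics; a minimal sketch, noting that PyTorch uses 0-based indexing where Lua Torch is 1-based:

```python
import torch

x = torch.tensor([[1., 2.], [3., 4.]])
y = x.narrow(0, 1, 1)  # the second row, returned as a view of x's storage

# No copy was made: writing through the view changes the original tensor.
y[0, 0] = 30.
assert x[1, 0].item() == 30.
```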

If you really want to modify the original tensor in-place, you can use set:

th> x:set(y)
 3  4
[torch.DoubleTensor of size 1x2]

Or, even simpler, x = y.
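In modern PyTorch the analogue of set is Tensor.set_, which makes one tensor adopt another tensor's storage, size and stride in place; a hedged sketch:

```python
import torch

x = torch.tensor([[1., 2.], [3., 4.]])
y = x.narrow(0, 1, 1)  # view of the second row

# set_ repoints x at y's storage, offset, size and stride in place,
# so x now views only the second row; no data is copied.
x.set_(y)
assert x.shape == (1, 2)
assert torch.equal(x, torch.tensor([[3., 4.]]))
```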

