What is the difference between .flatten() and .view(-1) in PyTorch?


Problem description

Both .flatten() and .view(-1) flatten a tensor in PyTorch. What's the difference?

  1. Does .flatten() copy the data of the tensor?
  2. Is .view(-1) faster?
  3. Is there any situation where .flatten() doesn't work?

Answer

In addition to @adeelh's comment, there is another difference: torch.flatten() results in a call to .reshape(), and the differences between .reshape() and .view() are:

  • [...] torch.reshape may return a copy or a view of the original tensor. You cannot count on it returning a view or a copy.

Another difference is that reshape() can operate on both contiguous and non-contiguous tensors, while view() can only operate on contiguous tensors. Also see here about the meaning of contiguous.
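
A quick way to see both differences in practice (not part of the original answer, just a minimal sketch) is to compare the two calls on a contiguous tensor and on a transposed, non-contiguous one, using data_ptr() to tell whether the data was copied:

    import torch

    x = torch.arange(6).reshape(2, 3)        # contiguous tensor

    # On a contiguous tensor, both calls return views: no data is copied,
    # so the results share storage with x.
    assert x.flatten().data_ptr() == x.data_ptr()
    assert x.view(-1).data_ptr() == x.data_ptr()

    # Make the tensor non-contiguous, e.g. by transposing it.
    y = x.t()
    assert not y.is_contiguous()

    # view(-1) cannot operate on a non-contiguous tensor and raises a RuntimeError...
    try:
        y.view(-1)
    except RuntimeError as e:
        print("view failed:", e)

    # ...while flatten() (which goes through reshape) still works, but has to
    # copy the data, so the result no longer shares storage with y.
    z = y.flatten()
    print(z)                               # tensor([0, 3, 1, 4, 2, 5])
    print(z.data_ptr() == y.data_ptr())    # False: the data was copied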

Context:

  • The community requested a flatten function for a while, and after Issue #7743, the feature was implemented in PR #8578.

You can see the implementation of flatten here, where a call to .reshape() can be seen in the return line.
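
To make that concrete, here is a hedged sketch of the idea (flatten_sketch is a hypothetical helper, not the actual ATen implementation): collapse the dimensions from start_dim to end_dim into a single size and delegate to reshape(), which is why the return line of the real implementation is a reshape call.

    import torch

    def flatten_sketch(t: torch.Tensor, start_dim: int = 0, end_dim: int = -1) -> torch.Tensor:
        # Illustrative only: compute the collapsed size of dims start_dim..end_dim
        # and hand the new shape to reshape(), mirroring the fact that
        # torch.flatten() ends in a call to .reshape().
        end_dim = end_dim % t.dim()                      # normalize a negative end_dim
        shape = list(t.shape)
        collapsed = 1
        for s in shape[start_dim:end_dim + 1]:
            collapsed *= s
        new_shape = shape[:start_dim] + [collapsed] + shape[end_dim + 1:]
        return t.reshape(new_shape)

    x = torch.arange(24).reshape(2, 3, 4)
    assert torch.equal(flatten_sketch(x), torch.flatten(x))        # flatten everything
    assert torch.equal(flatten_sketch(x, 1), torch.flatten(x, 1))  # keep the batch dim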

