Tensor reduction based off index vector
Question
As an example, I have 2 tensors: A = [1;2;3;4;5;6;7] and B = [2;3;2]. The idea is that I want to reduce A based on B, such that B's values represent how to sum A's values: B = [2;3;2] means the reduced A should be the sum of the first 2 values, the next 3, and the last 2: A' = [(1+2);(3+4+5);(6+7)]. Obviously, the sum of B must always equal the length of A. I'm trying to do this as efficiently as possible - preferably with specific functions or matrix operations available in PyTorch/Python. Thanks!
Answer
Here is the solution:
- First, we create an array of indices B_idx with the same size as A.
- Then, accumulate (add) all elements in A based on the indices B_idx, using index_add_.
import torch

A = torch.arange(1, 8)        # tensor([1, 2, 3, 4, 5, 6, 7])
B = torch.tensor([2, 3, 2])
# For each group i, repeat the index i exactly B[i] times
B_idx = [idx.repeat(times) for idx, times in zip(torch.arange(len(B)), B)]
B_idx = torch.cat(B_idx)      # tensor([0, 0, 1, 1, 1, 2, 2])
# Accumulate elements of A into the slot given by their index in B_idx
A_sum = torch.zeros_like(B)
A_sum.index_add_(dim=0, index=B_idx, source=A)
print(A_sum)                  # tensor([ 3, 12, 13])
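As a side note (not part of the original answer), the list comprehension plus torch.cat can be replaced by a single call to torch.repeat_interleave, which builds the same index tensor without a Python-level loop:

```python
import torch

A = torch.arange(1, 8)
B = torch.tensor([2, 3, 2])

# repeat_interleave repeats each group index i exactly B[i] times
B_idx = torch.repeat_interleave(torch.arange(len(B)), B)  # tensor([0, 0, 1, 1, 1, 2, 2])

A_sum = torch.zeros_like(B).index_add_(0, B_idx, A)
print(A_sum)  # tensor([ 3, 12, 13])
```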