How to implement K-Max pooling in Tensorflow or Keras?
Question
First off, I know that I should use top_k, but what makes k-max pooling hard to implement in TF is that it has to preserve the order.
What I have so far:
import tensorflow as tf
from tensorflow.contrib.framework import sort  # TF 1.x; use tf.sort in TF 2.x
sess = tf.Session()
a = tf.convert_to_tensor([[[5, 1, 10, 2], [3, 11, 2, 6]]])
# top-k indices along the last axis, re-sorted to preserve the original order
b = sort(tf.nn.top_k(a, k=2)[1])
print(tf.gather(a, b, axis=-1).eval(session=sess))
It's close but not there yet.
what I get: [[[[[ 5, 10], [ 1, 2]]], [[[ 3, 2], [11, 6]]]]]
what I want: [[[5, 10], [11, 6]]]
I am almost a hundred percent sure that gather_nd is required, but I can't figure it out. I am also a PyTorch user, and it's really easy there:
import torch
a = torch.LongTensor([[[5, 1, 10, 2], [3, 11, 2, 6]]])
# top-k indices along the last dim, sorted to preserve the original order
b = a.topk(2, dim=-1)[1].sort(dim=-1)[0]
print(a.gather(-1, b))
Oh, and also every code snippet that I found was not order-preserving (which is semantically wrong).
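To make the intended order-preserving semantics concrete, here is a framework-agnostic NumPy sketch (the helper name k_max_pooling and the use of np.take_along_axis are my own choices, not part of the question):

```python
import numpy as np

def k_max_pooling(a, k):
    # indices of the k largest values along the last axis (in arbitrary order)
    idx = np.argpartition(a, -k, axis=-1)[..., -k:]
    # sorting the indices restores the original left-to-right order
    idx = np.sort(idx, axis=-1)
    return np.take_along_axis(a, idx, axis=-1)

a = np.array([[[5, 1, 10, 2], [3, 11, 2, 6]]])
# k_max_pooling(a, 2).tolist() == [[[5, 10], [11, 6]]]
print(k_max_pooling(a, 2))
```

Note that a plain top-k followed by no index sort would return [10, 5] for the first row; sorting the indices (not the values) is what keeps 5 before 10.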
Answer
Weird... this should be super easy and yet we can't find a ready solution...
Try this:
import tensorflow as tf
from tensorflow.contrib.framework import sort  # TF 1.x; use tf.sort in TF 2.x

sess = tf.Session()
k = 2
a = tf.convert_to_tensor([[[5, 1, 10, 2], [3, 11, 2, 6]]])

# top-k indices along the last axis, re-sorted to preserve the original order
b = tf.nn.top_k(a, k=k, sorted=True)[1]
b = sort(b)

flatA = tf.reshape(a, (-1,))
shapeA = tf.shape(a)
lenA = tf.shape(flatA)[0]
kShape = tf.concat([shapeA[:-1], tf.constant([k])], axis=-1)

# flat index of the first element of every last-axis row
indices = tf.range(lenA)
indices = tf.reshape(indices, shapeA)
toSum = tf.expand_dims(tf.gather(indices, 0, axis=-1), axis=-1)

# shift the per-row indices into indices into the flattened tensor,
# gather from the flat tensor, then restore the (..., k) shape
b += toSum
b = tf.reshape(b, (-1,))
gat = tf.gather(flatA, b)
gat = tf.reshape(gat, kShape)
print(gat.eval(session=sess))
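The flat-offset trick above can be checked against a short NumPy equivalent (a verification sketch of mine, not part of the original answer): each last-axis row's starting offset in the flattened array is added to its sorted top-k indices, turning per-row indices into flat indices.

```python
import numpy as np

a = np.array([[[5, 1, 10, 2], [3, 11, 2, 6]]])
k = 2

# sorted top-k indices along the last axis (order-preserving)
b = np.sort(np.argsort(a, axis=-1)[..., -k:], axis=-1)

# flat offset of the first element of every last-axis row
offsets = np.arange(a.size).reshape(a.shape)[..., :1]

# shift per-row indices into flat indices, gather, and restore the shape
flat_idx = (b + offsets).reshape(-1)
gat = a.reshape(-1)[flat_idx].reshape(a.shape[:-1] + (k,))
assert gat.tolist() == [[[5, 10], [11, 6]]]
```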