Keras masking zero before softmax


Problem description

Suppose that I have the following output from an LSTM layer:

[0.         0.         0.         0.         0.01843184 0.01929785 0.         0.         0.         0.         0.         0. ]

and I want to apply softmax on this output, but I want to mask the 0's first.

When using

mask = Masking(mask_value=0.0)(lstm_hidden)
combined = Activation('softmax')(mask)

it didn't work. Any ideas?

Update: The output from the LSTM hidden layer has shape (batch_size, 50, 4000).

Answer

You can define a custom activation to achieve it. This is equivalent to masking the 0's.

from keras.layers import Activation,Input
import keras.backend as K
from keras.utils.generic_utils import get_custom_objects
import numpy as np
import tensorflow as tf

def custom_activation(x):
    x = K.switch(tf.is_nan(x), K.zeros_like(x), x)  # replace NaNs with zeros (tf.math.is_nan in TF2)
    # exp(x) == 1 exactly where x == 0, so zero those entries out instead of exponentiating them
    x = K.switch(K.equal(K.exp(x), 1), K.zeros_like(x), K.exp(x))
    return x / K.sum(x, axis=-1, keepdims=True)  # normalize over the last axis

lstm_hidden = Input(shape=(12,))
get_custom_objects().update({'custom_activation': Activation(custom_activation)})
combined = Activation(custom_activation)(lstm_hidden)

x = np.array([[0.,0.,0.,0.,0.01843184,0.01929785,0.,0.,0.,0.,0.,0. ]])
with K.get_session() as sess:
    print(combined.eval(feed_dict={lstm_hidden:x}))

[[0.         0.         0.         0.         0.49978352 0.50021654
  0.         0.         0.         0.         0.         0.        ]]
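As a sanity check, the same masked softmax can be reproduced in plain NumPy (a minimal sketch; `masked_softmax` is a hypothetical helper name, not part of Keras or the answer above):

```python
import numpy as np

def masked_softmax(x, mask_value=0.0):
    """Softmax over the non-masked entries only; masked entries stay 0."""
    mask = (x != mask_value)
    e = np.where(mask, np.exp(x), 0.0)  # keep exp(x) only where not masked
    return e / e.sum(axis=-1, keepdims=True)

x = np.array([[0., 0., 0., 0., 0.01843184, 0.01929785, 0., 0., 0., 0., 0., 0.]])
print(masked_softmax(x))  # the two non-zero entries split the mass ~0.4998 / ~0.5002
```

This treats an exact 0.0 as the mask value, just like the `K.equal(K.exp(x), 1)` trick, so it would mis-mask a genuine activation of exactly zero; that caveat applies equally to the Keras answer.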
