Seq2Seq Bidirectional Encoder Decoder in Keras

Question

I am trying to implement a seq2seq encoder-decoder using Keras, with a bidirectional LSTM on the encoder, as follows:

from keras.layers import LSTM,Bidirectional,Input,Concatenate
from keras.models import Model

n_units = 8
n_input = 1
n_output = 1

# encoder
encoder_inputs = Input(shape=(None, n_input))
encoder = Bidirectional(LSTM(n_units, return_state=True))
encoder_outputs, forward_h, forward_c, backward_h, backward_c = encoder(encoder_inputs)
state_h = Concatenate()([forward_h, backward_h])
state_c = Concatenate()([forward_c, backward_c])
encoder_states = [state_h, state_c]

# decoder
decoder_inputs = Input(shape=(None, n_output))    
decoder_lstm = LSTM(n_units*2, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=encoder_states)

Here is the error I got on the last line:

ValueError: Dimensions must be equal, but are 8 and 16 for 
'lstm_2_1/MatMul_4' (op: 'MatMul') with input shapes: [?,8], [16,16].

Any ideas?

Answer

Although the error points to the last line of the code block in the question, it was actually caused by the wrong number of hidden units in the inference decoder: the bidirectional encoder concatenates its forward and backward states, so each decoder state is 2*n_units wide, and the inference decoder's state inputs have to be declared accordingly. Solved!
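
Since the question doesn't show the inference-decoder code that actually triggered the error, the mismatch was presumably along these lines:

# presumed offending line: an 8-dim state input cannot take the 16-dim concatenated encoder state
decoder_state_input_h = Input(shape=(n_units,))
# fixed: matches the 2*n_units concatenated bidirectional states
decoder_state_input_h = Input(shape=(n_units*2,))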

Full working code:

from keras.layers import LSTM, Bidirectional, Input, Concatenate, Dense
from keras.models import Model

n_units = 8
n_input = 1
n_output = 1

# encoder
encoder_inputs = Input(shape=(None, n_input))
encoder = Bidirectional(LSTM(n_units, return_state=True))
encoder_outputs, forward_h, forward_c, backward_h, backward_c = encoder(encoder_inputs)
# each direction's state is n_units wide, so the concatenated states are 2*n_units wide
state_h = Concatenate()([forward_h, backward_h])
state_c = Concatenate()([forward_c, backward_c])
encoder_states = [state_h, state_c]

# decoder: needs n_units*2 units to accept the concatenated encoder states
decoder_inputs = Input(shape=(None, n_output))
decoder_lstm = LSTM(n_units*2, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs, initial_state=encoder_states)
decoder_dense = Dense(n_output, activation='softmax')
decoder_outputs = decoder_dense(decoder_outputs)
model = Model([encoder_inputs, decoder_inputs], decoder_outputs)


# define inference encoder
encoder_model = Model(encoder_inputs, encoder_states)
# define inference decoder: the state inputs must be 2*n_units wide (this was the bug)
decoder_state_input_h = Input(shape=(n_units*2,))
decoder_state_input_c = Input(shape=(n_units*2,))
decoder_states_inputs = [decoder_state_input_h, decoder_state_input_c]
decoder_outputs, state_h, state_c = decoder_lstm(decoder_inputs, initial_state=decoder_states_inputs)
decoder_states = [state_h, state_c]
decoder_outputs = decoder_dense(decoder_outputs)
decoder_model = Model([decoder_inputs] + decoder_states_inputs, [decoder_outputs] + decoder_states)
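
For completeness, here is a minimal sketch of how the two inference models can be used together for step-by-step decoding. The zero start value, the fixed number of steps, and the decode_sequence helper name are assumptions for illustration, not part of the original answer:

import numpy as np

def decode_sequence(input_seq, n_steps=10):
    # encode the source sequence into the concatenated (2*n_units) states
    states = encoder_model.predict(input_seq)
    # assumed start value: an all-zero decoder input of shape (1, 1, n_output)
    target_seq = np.zeros((1, 1, n_output))
    decoded = []
    for _ in range(n_steps):
        # predict the next step and the updated decoder states
        yhat, h, c = decoder_model.predict([target_seq] + states)
        decoded.append(yhat[0, 0, :])
        # feed the prediction back in as the next decoder input
        target_seq = yhat
        states = [h, c]
    return np.array(decoded)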
