Keras: stacking multiple LSTM layers


Question

I have the following network which works fine:

output = LSTM(8)(output)
output = Dense(2)(output)

Now, for the same model, I am trying to stack a few LSTM layers like below:

output = LSTM(8)(output, return_sequences=True)
output = LSTM(8)(output)
output = Dense(2)(output)

But I get the following error:

TypeError                                 Traceback (most recent call last)
<ipython-input-2-0d0ced2c7417> in <module>()
     39 
     40 output = Concatenate(axis=2)([leftOutput,rightOutput])
---> 41 output = LSTM(8)(output, return_sequences=True)
     42 output = LSTM(8)(output)
     43 output = Dense(2)(output)

/usr/local/lib/python3.4/dist-packages/keras/layers/recurrent.py in __call__(self, inputs, initial_state, constants, **kwargs)
    480 
    481         if initial_state is None and constants is None:
--> 482             return super(RNN, self).__call__(inputs, **kwargs)
    483 
    484         # If any of `initial_state` or `constants` are specified and are Keras

/usr/local/lib/python3.4/dist-packages/keras/engine/topology.py in __call__(self, inputs, **kwargs)
    601 
    602             # Actually call the layer, collecting output(s), mask(s), and shape(s).
--> 603             output = self.call(inputs, **kwargs)
    604             output_mask = self.compute_mask(inputs, previous_mask)
    605 

TypeError: call() got an unexpected keyword argument 'return_sequences'

This is confusing because return_sequences is a valid argument according to the Keras documentation: https://keras.io/layers/recurrent/#lstm

What did I do wrong here? Thanks!

Answer

The problem is that return_sequences must be passed as an argument to the layer constructor, not to the layer call. Changing the code to:

output = LSTM(8, return_sequences=True)(output)

resolves the error.
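To make the fix concrete, here is a minimal runnable sketch of the corrected stacking. The input shape and the Input/Model wiring are assumptions for illustration, since the original post only shows the output layers (the question's full model concatenates two branches, which is omitted here):

```python
from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

# Assumed input shape: 10 timesteps of 16 features (not given in the post).
inputs = Input(shape=(10, 16))

# return_sequences=True goes in the constructor, so this layer emits the
# full per-timestep sequence with shape (None, 10, 8) ...
x = LSTM(8, return_sequences=True)(inputs)

# ... which the next LSTM consumes, returning only its final state (None, 8).
x = LSTM(8)(x)
outputs = Dense(2)(x)

model = Model(inputs, outputs)
print(model.output_shape)
```

The distinction matters because return_sequences changes the layer's output shape, which Keras must know at construction time to build the graph; arguments passed at call time are forwarded to the layer's call() method, which does not accept return_sequences, hence the TypeError.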

