How could I get both the final hidden state and the sequence from an LSTM layer when using a Bidirectional wrapper


Problem description

I have followed the steps in https://machinelearningmastery.com/return-sequences-and-return-states-for-lstms-in-keras/, but when it comes to the Bidirectional LSTM, I tried this:

lstm, state_h, state_c = Bidirectional(LSTM(128, return_sequences=True, return_state= True))(input)

But it doesn't work.

Is there some approach to get both the final hidden state and the sequence from an LSTM layer when using a Bidirectional wrapper?

Recommended answer

The call Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input) returns 5 tensors:

  1. The whole sequence of hidden states; by default this is the concatenation of the forward and backward sequences.
  2. The last hidden state h of the forward LSTM.
  3. The last cell state c of the forward LSTM.
  4. The last hidden state h of the backward LSTM.
  5. The last cell state c of the backward LSTM.

The line you've posted raises an error because it tries to unpack the returned value into just three variables (lstm, state_h, state_c).

To correct it, simply unpack the returned value into 5 variables. If you want to merge the states, you can concatenate the forward and backward states with Concatenate layers:

lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(LSTM(128, return_sequences=True, return_state=True))(input)
state_h = Concatenate()([forward_h, backward_h])
state_c = Concatenate()([forward_c, backward_c])
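
For reference, here is a minimal end-to-end sketch of the same idea, assuming the tensorflow.keras API; the input shape (10, 32) and the variable names are hypothetical examples, not part of the original question:

from tensorflow.keras.layers import Input, LSTM, Bidirectional, Concatenate
from tensorflow.keras.models import Model

# Hypothetical input: sequences of length 10 with 32 features per step.
inputs = Input(shape=(10, 32))

# Five return values: the full output sequence plus the forward/backward h and c states.
lstm, forward_h, forward_c, backward_h, backward_c = Bidirectional(
    LSTM(128, return_sequences=True, return_state=True))(inputs)

# Merge the directional states into single tensors of width 256.
state_h = Concatenate()([forward_h, backward_h])
state_c = Concatenate()([forward_c, backward_c])

model = Model(inputs=inputs, outputs=[lstm, state_h, state_c])
model.summary()
# Expected output shapes:
#   lstm    -> (None, 10, 256)  forward and backward sequences concatenated
#   state_h -> (None, 256)
#   state_c -> (None, 256)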
