How to use multilayered bidirectional LSTM in Tensorflow?


Question

I want to know how to use multilayered bidirectional LSTM in Tensorflow.

I have already implemented a bidirectional LSTM, but I want to compare this model with a version that adds multiple layers.

How should I extend this part of the code?

# TF 1.x imports assumed: import tensorflow as tf; from tensorflow.contrib import rnn
x = tf.unstack(tf.transpose(x, perm=[1, 0, 2]))  # [batch, steps, input] -> list of n_steps tensors [batch, input]

# Define LSTM cells with TensorFlow
# Forward direction cell
lstm_fw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)
# Backward direction cell
lstm_bw_cell = rnn.BasicLSTMCell(n_hidden, forget_bias=1.0)

# Get LSTM cell output
try:
    outputs, _, _ = rnn.static_bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, x,
                                                 dtype=tf.float32)
except Exception:  # Old TensorFlow versions only return outputs, not states
    outputs = rnn.static_bidirectional_rnn(lstm_fw_cell, lstm_bw_cell, x,
                                           dtype=tf.float32)

# Linear activation applied to every time step's output
outputs = tf.stack(outputs, axis=1)
outputs = tf.reshape(outputs, (batch_size * n_steps, n_hidden * 2))
outputs = tf.matmul(outputs, weights['out']) + biases['out']
outputs = tf.reshape(outputs, (batch_size, n_steps, n_classes))

Answer

You can use two different approaches to build a multilayer BiLSTM model:

1) Use the output of the previous BiLSTM layer as the input to the next one. First, create lists of forward and backward cells of length num_layers. Then:

# cell_forw and cell_back are lists of num_layers LSTM cells,
# e.g. [rnn.BasicLSTMCell(n_hidden) for _ in range(num_layers)]
output = x  # the first layer consumes the model input
for n in range(num_layers):
    cell_fw = cell_forw[n]
    cell_bw = cell_back[n]

    state_fw = cell_fw.zero_state(batch_size, tf.float32)
    state_bw = cell_bw.zero_state(batch_size, tf.float32)

    (output_fw, output_bw), last_state = tf.nn.bidirectional_dynamic_rnn(
        cell_fw, cell_bw, output,
        initial_state_fw=state_fw,
        initial_state_bw=state_bw,
        scope='BLSTM_' + str(n),  # separate variable scope per layer
        dtype=tf.float32)

    # Concatenate forward and backward outputs along the feature axis
    output = tf.concat([output_fw, output_bw], axis=2)

2) Another approach worth a look is a stacked BiLSTM, e.g. via tf.contrib.rnn.stack_bidirectional_dynamic_rnn.
