LSTM Batches vs Timesteps
Question
I've followed the TensorFlow RNN tutorial to create an LSTM model. However, in the process, I've grown confused as to the difference, if any, between 'batches' and 'timesteps', and I'd appreciate help in clarifying this matter.
The tutorial code (see below) essentially creates 'batches' based on a designated number of steps:
with tf.variable_scope("RNN"):
    for time_step in range(num_steps):
        if time_step > 0: tf.get_variable_scope().reuse_variables()
        (cell_output, state) = cell(inputs[:, time_step, :], state)
        outputs.append(cell_output)
However, the following appears to do the same:
for epoch in range(5):
    print('----- Epoch', epoch, '-----')
    total_loss = 0
    for i in range(inputs_cnt // BATCH_SIZE):
        inputs_batch = train_inputs[i * BATCH_SIZE: (i + 1) * BATCH_SIZE]
        orders_batch = train_orders[i * BATCH_SIZE: (i + 1) * BATCH_SIZE]
        feed_dict = {story: inputs_batch, order: orders_batch}
        logits, xent, loss = sess.run([...], feed_dict=feed_dict)
Answer
Assuming you are working with text, BATCH_SIZE would be the number of sentences that you are processing in parallel and num_steps would be the maximum number of words in any sentence. These are different dimensions of your input to the LSTM.
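As a rough sketch of how the two dimensions fit together (TF 1.x style, matching the question's code; the cell size, embedding_size, and the concrete values of BATCH_SIZE and num_steps below are illustrative assumptions, not from the tutorial):

import tensorflow as tf

BATCH_SIZE = 32       # sentences processed in parallel
num_steps = 20        # maximum number of words (timesteps) per sentence
embedding_size = 128  # size of each word vector (assumed)

# One training batch: BATCH_SIZE sentences, each unrolled over num_steps words.
inputs = tf.placeholder(tf.float32, [BATCH_SIZE, num_steps, embedding_size])

cell = tf.nn.rnn_cell.BasicLSTMCell(64)
state = cell.zero_state(BATCH_SIZE, tf.float32)

outputs = []
with tf.variable_scope("RNN"):
    for time_step in range(num_steps):  # loop over the time dimension only
        if time_step > 0:
            tf.get_variable_scope().reuse_variables()
        # inputs[:, time_step, :] has shape [BATCH_SIZE, embedding_size]:
        # the word at the same position taken from every sentence in the batch.
        (cell_output, state) = cell(inputs[:, time_step, :], state)
        outputs.append(cell_output)

So the outer training loop in your second snippet iterates over batches (the batch dimension), while the num_steps loop inside the variable scope unrolls the LSTM over the time dimension within each of those batches; they are not alternatives to one another.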