How to create multiple Keras LSTM layers using a for loop?
Question
I'm trying to implement a multi-layer LSTM in Keras using a for loop and this tutorial, so that I can optimize the number of layers, which is obviously a hyper-parameter. In the tutorial, the author used skopt for hyper-parameter optimization. I used the Functional API to create my model. For simplicity, I changed the shape of input_tensor to arbitrary values. My model is:
```python
from keras.layers.core import Dense
from keras.layers import LSTM, Input
from keras.models import Model
from keras.optimizers import RMSprop
from keras.initializers import glorot_uniform, glorot_normal, RandomUniform

input_tensor = Input(shape=(10, 20))

def create_model(learning_rate, num_lstm_layers, num_lstm_units, activation):
    init = glorot_normal(seed=None)
    init1 = RandomUniform(minval=-0.05, maxval=0.05)
    x = Input(shape=(10, 20))
    for i in range(num_lstm_layers):
        name = 'layer_lstm_{0}'.format(i+1)
        if (i == 0) and (num_lstm_layers == 1):
            x = LSTM(units=num_lstm_units, dropout=0.2, recurrent_dropout=0.2,
                     return_sequences=False, kernel_initializer=init,
                     activation=activation, name=name)(x)
        elif i != (num_lstm_layers - 1):
            x = LSTM(units=num_lstm_units, dropout=0.2, recurrent_dropout=0.2,
                     return_sequences=True, kernel_initializer=init,
                     activation=activation, name=name)(x)
        else:
            x = LSTM(units=num_lstm_units, dropout=0.2, recurrent_dropout=0.2,
                     return_sequences=False, kernel_initializer=init,
                     activation=activation, name=name)(x)
    x = Dense(1, activation='linear', kernel_initializer=init1)(x)
    model = Model(input_tensor, x)
    optimizer = RMSprop(lr=learning_rate, rho=0.9, epsilon=None, decay=0.0)
    model.compile(loss='mean_squared_error', optimizer=optimizer, metrics=['mse'])
    return model
```
Whenever I try to fit the model to the data, I encounter this error:

```
ValueError: Initializer for variable layer_lstm_1_14/kernel/ is from inside a control-flow construct, such as a loop or conditional. When creating a variable inside a loop or conditional, use a lambda as the initializer.
```
So far, I know that somewhere I should add a lambda function or a Keras Lambda layer. Also, I tested the model in a separate Python script, like below:
```python
model = create_model(learning_rate=1e-3, num_lstm_layers=3, num_lstm_units=64, activation='linear')
```
But again it gives me this error:

```
ValueError: Graph disconnected: cannot obtain value for tensor Tensor("input_2:0", shape=(?, 10, 20), dtype=float32) at layer "input_2". The following previous layers were accessed without issue: []
```
I also tried to create a Sequential version of the model, but encountered the same error.
Edit: I changed the if statement from `if i == 0:` to `if (i == 0) and (num_lstm_layers == 1):`. By doing so, and making the changes André suggested, you are able to create LSTM models using a for loop.
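The layer-wiring rule that the loop and if/elif/else above implement (every LSTM except the last one returns full sequences, so the next LSTM still sees 3-D input) can be sketched as plain logic, independent of Keras. The helper name is my own, introduced only for illustration:

```python
def return_sequences_flags(num_lstm_layers):
    """Which LSTM layers in a stack should use return_sequences=True.

    Every LSTM except the last must emit its whole output sequence so
    that the next LSTM receives 3-D (batch, time, features) input; only
    the final layer collapses the time dimension with
    return_sequences=False.
    """
    return [i != num_lstm_layers - 1 for i in range(num_lstm_layers)]
```

For a single-layer stack this yields `[False]`, which is why the special-case `if (i == 0) and (num_lstm_layers == 1):` branch also sets `return_sequences=False`.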
Answer
As I said in the comments, I'm not worried about your for-loop, but rather the input. I'm not 100% sure, but I think you should try to delete

```python
input_tensor = Input(shape=(10, 20))
```

which comes before the `create_model(...)` function, and edit the creation of the one inside as follows:
```python
input_tensor = x = Input(shape=(10, 20))
```
Reading on, you say you get `Graph disconnected: cannot obtain value for tensor`. That definitely sounds like your input isn't connected. The change I suggest should connect your input and output (respectively the first and second arguments of `Model(...)`).
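Putting the fix together, a minimal sketch of the corrected `create_model` might look like the following. Note my assumptions: it is written against `tensorflow.keras` rather than standalone Keras, and the custom initializers are dropped for brevity; the key point is that the `Input` is created inside the function and a handle to it (`input_tensor`) is kept while `x` is rebound layer by layer:

```python
from tensorflow.keras.layers import Dense, Input, LSTM
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import RMSprop

def create_model(learning_rate, num_lstm_layers, num_lstm_units, activation):
    # Create the Input *inside* the function; input_tensor stays the
    # graph's entry point while x is rebound layer by layer.
    input_tensor = x = Input(shape=(10, 20))
    for i in range(num_lstm_layers):
        # Only the last LSTM collapses the time dimension.
        last = (i == num_lstm_layers - 1)
        x = LSTM(units=num_lstm_units,
                 dropout=0.2, recurrent_dropout=0.2,
                 return_sequences=not last,
                 activation=activation,
                 name='layer_lstm_{0}'.format(i + 1))(x)
    x = Dense(1, activation='linear')(x)
    # Input and output now belong to the same connected graph.
    model = Model(input_tensor, x)
    model.compile(loss='mean_squared_error',
                  optimizer=RMSprop(learning_rate=learning_rate),
                  metrics=['mse'])
    return model
```

Because `input_tensor` and `x` start as the same tensor, every layer applied to `x` traces back to the model's input, so `Model(input_tensor, x)` no longer raises the "Graph disconnected" error.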