How to switch Off/On an LSTM layer?


Problem Description

I am looking for a way to access the LSTM layers of a model so that adding and removing a layer is event-driven: a layer can be added or removed when a function fires a trigger. For example (hypothetically): add an LSTM layer if a = 2 and remove an LSTM layer if a = 3.

Here a = 2 and a = 3 are meant to come from a Python function that returns a specific value, based on which an LSTM layer should be added or removed. I want to attach a switch function to the layer so that it can be turned on or off depending on what that Python function returns.

Is this possible?
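
For instance, something along these lines is the kind of behaviour I am after (purely a hypothetical sketch; trigger() and the layer sizes are invented for illustration):

from keras.models import Sequential
from keras.layers import LSTM, Dropout

def trigger():
    # hypothetical event source: return 2 to add the extra layer, 3 to leave it out
    return 2

def build_regressor(input_shape):
    regressor = Sequential()
    regressor.add(LSTM(units=60, return_sequences=True, input_shape=input_shape))
    if trigger() == 2:
        # event says "add": include a second LSTM layer with Dropout
        regressor.add(LSTM(units=60, return_sequences=True))
        regressor.add(Dropout(0.1))
    # trigger() == 3 would simply leave the second LSTM layer out
    return regressor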

Currently, I have to hard-code the layers I need. For example:

from keras.models import Sequential
from keras.layers import LSTM, Dropout

# Initialising the RNN
regressor = Sequential()

# Adding the first LSTM layer and some Dropout regularization
regressor.add(LSTM(units=60, return_sequences=True,
                   input_shape=(X_train.shape[1], X_train.shape[2])))
#regressor.add(Dropout(0.1))

# Adding the 2nd LSTM layer and some Dropout regularization
regressor.add(LSTM(units=60, return_sequences=True))
regressor.add(Dropout(0.1))

My goal is to both add and remove these layers at runtime. Any help is appreciated!

Recommended Answer

I found the answer and am posting it in case anyone else is looking for a solution. This can be done with Keras' layer-freezing functionality: pass the boolean trainable argument to the layer constructor to mark the layer as non-trainable.

For example:

 frozen_layer = Dense(32, trainable=False)

Additionally, you can set the trainable property of a layer to True or False after instantiation. For the change to take effect, you need to call compile() on your model after modifying the trainable property. For example:

    from keras.layers import Input, Dense
    from keras.models import Model

    x = Input(shape=(32,))
    layer = Dense(32)
    layer.trainable = False
    y = layer(x)

    frozen_model = Model(x, y)
    # the weights of `layer` will not be updated during training of this model
    frozen_model.compile(optimizer='rmsprop', loss='mse')

    layer.trainable = True
    trainable_model = Model(x, y)
    # the weights of `layer` will be updated during training
    # (which also affects the frozen model above, since it shares the same layer instance)
    trainable_model.compile(optimizer='rmsprop', loss='mse')

    # `data` and `labels` stand in for your training arrays
    frozen_model.fit(data, labels)      # this does NOT update the weights of `layer`
    trainable_model.fit(data, labels)   # this DOES update the weights of `layer`
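
To tie this back to the event-driven part of the question, one possibility is to wrap the trainable toggle in a small helper and recompile whenever the trigger value changes. This is only a rough sketch following the freezing idea above; switch_lstm(), the trigger values (2/3), and the input shape are invented for illustration:

    from keras.layers import Input, LSTM, Dense
    from keras.models import Model

    inputs = Input(shape=(None, 5))            # (timesteps, features) -- made-up shape
    lstm_layer = LSTM(60, return_sequences=True)
    outputs = Dense(1)(lstm_layer(inputs))
    model = Model(inputs, outputs)

    def switch_lstm(a):
        # a == 2 -> "switch on" (trainable), a == 3 -> "switch off" (frozen)
        lstm_layer.trainable = (a == 2)
        # recompile so the change takes effect before the next fit() call
        model.compile(optimizer='rmsprop', loss='mse')

Note that this freezes or unfreezes the LSTM layer rather than physically removing it from the graph, which is what the freezing approach gives you.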

Hope this helps!
