Add DropOut after loading the weights in Keras


Question

I am doing a kind of transfer learning. What I have done is first train the model on a big dataset and save the weights. Then I train the model on my own dataset with the layers frozen, but I see some overfitting. So I try to change the model's dropout and then load the weights, since the numbers change when the dropout changes. I am finding it difficult to change the dropout.

My question, directly: is it possible to change the model's dropout while loading the weights?

My first case is like this:

  1. Model defined.
  2. Train the model.
  3. Save the weights.
  4. ...

Redefine the dropout; everything else in the model is unchanged.
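The steps above can be sketched as follows. This is a minimal illustration using `tf.keras`; the layer sizes, file name, and random data are placeholders, not from the question. Because `Dropout` has no weights of its own, a model redefined with a different rate has exactly the same weight shapes, so the saved weights load cleanly:

```python
# Hypothetical sketch: train, save weights, then rebuild with a new Dropout rate.
import numpy as np
from tensorflow import keras

def build_model(rate):
    # Same architecture each time; only the Dropout rate differs,
    # so the saved weights still match.
    return keras.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(8, activation='relu'),
        keras.layers.Dropout(rate),
        keras.layers.Dense(2, activation='softmax'),
    ])

model = build_model(rate=0.2)
model.compile(optimizer='sgd', loss='categorical_crossentropy')
x = np.random.rand(16, 4).astype('float32')                      # placeholder data
y = keras.utils.to_categorical(np.random.randint(2, size=16), 2)
model.fit(x, y, epochs=1, verbose=0)
model.save_weights('big_dataset.weights.h5')                     # placeholder file name

# Redefine with a different Dropout rate and load the saved weights.
model2 = build_model(rate=0.5)
model2.load_weights('big_dataset.weights.h5')
```

Here `build_model` and the file name are invented for the sketch; the point is only that changing the `Dropout` rate does not change the weight list.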

My second case:

  1. model1 defined.
  2. Train the model.
  3. Save the weights.
  4. Load the model1 weights into model1.
  5. ...
  6. model2 defined.

I try to set the weights of model1 on model2 with a for loop, skipping the dropout layer, but I get an error.

Here is the error I get:

 File "/home/sathiyakugan/PycharmProjects/internal-apps/apps/support-tools/EscalationApp/LSTM_Attention_IMDB_New_open.py", line 343, in <module>
    NewModel.layers[i].set_weights(layer.get_weights())
  File "/home/sathiyakugan/PycharmProjects/Python/venv/lib/python3.5/site-packages/keras/engine/base_layer.py", line 1062, in set_weights
    str(weights)[:50] + '...')
ValueError: You called `set_weights(weights)` on layer "lstm_5" with a  weight list of length 1, but the layer was expecting 3 weights. Provided weights: [array([[ 0.      ,  0.      ,  0.      , ...,  0....

What is the right way to do this? Since I am new to Keras, I am struggling to get further.

Answer

I recommend you load the weights with the function model.load_weights("weights_file.h5") and then try the following:

for layer in model.layers:
    if hasattr(layer, 'rate'):  # only Dropout layers expose a `rate` attribute
        layer.rate = 0.5

Since only Dropout layers have the attribute rate, whenever you find a layer with this attribute you can change it. Here I use 0.5 as the dropout probability; put whatever value you want.

If you are setting the weights layer by layer, you can fold the if above into your for loop over the layers.
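A sketch of that combined loop, under the assumption that the two models share the same architecture (the models below are tiny stand-ins, not the asker's). Pairing layers one-to-one also avoids the `set_weights` length mismatch in the traceback above, which happens when the two layer lists do not line up:

```python
# Hypothetical models: `old_model` stands in for the trained model1.
import numpy as np
from tensorflow import keras

def build(rate):
    return keras.Sequential([
        keras.Input(shape=(4,)),
        keras.layers.Dense(8, activation='relu'),
        keras.layers.Dropout(rate),
        keras.layers.Dense(2, activation='softmax'),
    ])

old_model = build(0.2)
new_model = build(0.2)

# Copy weights layer by layer; Dropout has no weights, so just update its rate.
for old_layer, new_layer in zip(old_model.layers, new_model.layers):
    if hasattr(new_layer, 'rate'):
        new_layer.rate = 0.5
    else:
        new_layer.set_weights(old_layer.get_weights())
```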

IMPORTANT: after this you have to compile the model again:

from keras.optimizers import SGD
model.compile(optimizer=SGD(lr=1e-3, momentum=0.9), loss='categorical_crossentropy', metrics=['accuracy'])

Again, the parameters passed here are just for example purposes, so change them according to your problem.
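Putting the answer's steps together, a minimal end-to-end sketch. The model, file name, rates, and optimizer settings are placeholders; the answer's code uses standalone `keras` with `lr`, while recent `tf.keras` spells it `learning_rate`:

```python
from tensorflow import keras

# Placeholder model standing in for the one trained on the big dataset.
model = keras.Sequential([
    keras.Input(shape=(4,)),
    keras.layers.Dense(8, activation='relu'),
    keras.layers.Dropout(0.2),
    keras.layers.Dense(2, activation='softmax'),
])
model.save_weights('weights_file.weights.h5')  # stand-in for the trained weights

model.load_weights('weights_file.weights.h5')  # 1. load the saved weights
for layer in model.layers:                     # 2. change the Dropout rate
    if hasattr(layer, 'rate'):
        layer.rate = 0.5
# 3. compile again so the change takes effect in training
model.compile(optimizer=keras.optimizers.SGD(learning_rate=1e-3, momentum=0.9),
              loss='categorical_crossentropy', metrics=['accuracy'])
```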

