Reset weights in Keras layer

Problem description

I'd like to reset (randomize) the weights of all layers in my Keras (deep learning) model. The reason is that I want to be able to train the model several times with different data splits without having to do the (slow) model recompilation every time.

Inspired by this discussion, I'm trying the following code:

# Reset weights
# (Keras 1-style API: relies on layer.init / layer.output_dim and only
#  re-initializes trainable_weights[0], i.e. the kernel, of each layer)
for layer in KModel.layers:
    if hasattr(layer, 'init'):
        input_dim = layer.input_shape[1]
        new_weights = layer.init((input_dim, layer.output_dim), name='{}_W'.format(layer.name))
        layer.trainable_weights[0].set_value(new_weights.get_value())

However, it only partly works.

Partly, because I've inspected some layer.get_weights() values, and they do seem to change. But when I restart the training, the cost values are much lower than the initial cost values on the first run. It's almost as if I've succeeded in resetting some of the weights, but not all of them.

Any tips on where I'm going wrong would be deeply appreciated. Thx..

Recommended answer

Save the initial weights right after compiling the model but before training it:

model.save_weights('model.h5')

and then after training, "reset" the model by reloading the initial weights:

model.load_weights('model.h5')

This gives you an apples-to-apples model for comparing different data sets, and it should be quicker than recompiling the entire model.
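
Putting the two steps together, here is a minimal sketch of that workflow, assuming the standalone Keras API; the model architecture, the 'initial_weights.h5' file name, and the random data splits are illustrative placeholders only:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Build and compile the model once.
model = Sequential([
    Dense(64, activation='relu', input_shape=(20,)),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# Save the freshly initialized weights before any training happens.
model.save_weights('initial_weights.h5')

# Two illustrative data splits made of random data (placeholders only).
splits = [
    (np.random.rand(100, 20), np.random.randint(0, 2, size=(100, 1))),
    (np.random.rand(100, 20), np.random.randint(0, 2, size=(100, 1))),
]

for x_train, y_train in splits:
    # "Reset" the model by reloading the saved initial weights,
    # then train on this split without recompiling.
    model.load_weights('initial_weights.h5')
    model.fit(x_train, y_train, epochs=5, verbose=0)

Note that load_weights only restores the parameter values; the model stays compiled, which is what avoids the slow recompilation step.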
