How to exclude some layers' weights, keep only the weights I want, and predict with the model using Keras?


Question

I want to extract the weights of some chosen layers and save them as an HDF5 file called encoded_weight.h5, then run prediction with the model to get the output.

The original model contains all the weights, including those I don't need.

model = Autoencoder(input_shape=x_train.shape[1:])  # this is the original model
model.summary()
layer_name_list = ['dense2048', 'batch2048', 'act2048',
                   'dense1024', 'batch1024', 'act1024',
                   'dense512', 'batch512', 'act512']

layer_dict = dict([(layer.name, layer) for layer in model.layers])
for layer_name in layer_name_list:
    layer_output = layer_dict[layer_name].get_weights()  # list of numpy arrays for this layer

The code above gets the weights I want as lists of arrays, but I don't know how to save them as "encoded_weight.h5" so that I can use it in the code below to predict with the original model.

model.load_weights('encoded_weight.h5', by_name=True)
model.compile(optimizer=Adam(), loss='mean_squared_error', metrics=['mae'])
z_train = model.predict(x=x_train_z, verbose=2)

Answer

You can save your model's weights using TensorFlow's save_weights method.

model.save_weights(
    'encoded_weight.h5', overwrite=True, save_format=None, options=None
)
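
With save_format=None the format is inferred from the filename, so the .h5 extension here produces a single HDF5 file rather than a TensorFlow checkpoint.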

You can load these weights back with:

model.load_weights('encoded_weight.h5')
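
Note that save_weights writes every layer of the model; by_name=True on load_weights only matters when the file was saved from a different architecture and layers should be matched by name.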

If you want to access the individual weights of individual layers, you can do that as follows.

Code:

import h5py

# A recursive function that yields the path of every dataset inside 'encoded_weight.h5'
def traverse_datasets(hdf_file):

    def h5py_dataset_iterator(g, prefix=''):
        for key in g.keys():
            item = g[key]
            path = f'{prefix}/{key}'
            if isinstance(item, h5py.Dataset):  # test for dataset
                yield (path, item)
            elif isinstance(item, h5py.Group):  # test for group (go down)
                yield from h5py_dataset_iterator(item, path)

    for path, _ in h5py_dataset_iterator(hdf_file):
        yield path

filename = "encoded_weight.h5"
hf = h5py.File(filename, "r")

for dset in traverse_datasets(hf):
    print('Path:', dset)
    print(hf[dset])
#     print(np.array(hf[dset]))   # contains your array
    print('-----------------------')

Output:

Path: /conv1d/conv1d/bias:0
<HDF5 dataset "bias:0": shape (64,), type "<f4">
-----------------------
Path: /conv1d/conv1d/kernel:0
<HDF5 dataset "kernel:0": shape (3, 1, 64), type "<f4">
-----------------------
Path: /dense/dense/bias:0
<HDF5 dataset "bias:0": shape (128,), type "<f4">
-----------------------
Path: /dense/dense/kernel:0
<HDF5 dataset "kernel:0": shape (3712, 128), type "<f4">
-----------------------
Path: /dense_1/dense_1/bias:0
<HDF5 dataset "bias:0": shape (5,), type "<f4">
-----------------------
Path: /dense_1/dense_1/kernel:0
<HDF5 dataset "kernel:0": shape (128, 5), type "<f4">
-----------------------

Using this, you can update the weights of individual layers with the set_weights method.

The layers of my model:

model.layers

Output:

[<tensorflow.python.keras.layers.convolutional.Conv1D at 0x209a3b41e08>,
 <tensorflow.python.keras.layers.pooling.MaxPooling1D at 0x209a9e40cc8>,
 <tensorflow.python.keras.layers.core.Flatten at 0x209a9e49708>,
 <tensorflow.python.keras.layers.core.Dense at 0x209a9e49588>,
 <tensorflow.python.keras.layers.core.Dropout at 0x209a9e4fa48>,
 <tensorflow.python.keras.layers.core.Dense at 0x209a9e56f08>]

Updating the weights of the conv1d layer:

Code:

import tensorflow as tf

w = [tf.constant(hf['/conv1d/conv1d/kernel:0']),
     tf.constant(hf['/conv1d/conv1d/bias:0'])]
model.layers[0].set_weights(w)  # kernel and bias, in the order get_weights() returns them
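
To come back to the original goal of keeping only the chosen layers: save_weights always writes the whole model, so one option is to write just those layers' arrays into your own HDF5 file with h5py and restore them with set_weights. Below is a minimal sketch, assuming model, layer_dict, and layer_name_list from the question; the group layout (one group per layer, datasets '0', '1', ...) is my own choice, not the layout Keras writes, so the file is read back with the matching loop rather than load_weights.

import h5py
import numpy as np

# Write only the selected layers' weights into our own HDF5 layout:
# one group per layer, one dataset per weight array ('0', '1', ...).
with h5py.File('encoded_weight.h5', 'w') as f:
    for layer_name in layer_name_list:
        grp = f.create_group(layer_name)
        for i, arr in enumerate(layer_dict[layer_name].get_weights()):
            grp.create_dataset(str(i), data=arr)

# Later: restore those weights into a freshly built model,
# leaving every other layer untouched.
with h5py.File('encoded_weight.h5', 'r') as f:
    for layer_name in layer_name_list:
        weights = [np.array(f[layer_name][str(i)])
                   for i in range(len(f[layer_name]))]
        model.get_layer(layer_name).set_weights(weights)

After restoring, you can compile the model and call model.predict exactly as in the question.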

