Can't change activations in existing Keras model


Problem description

I have a normal VGG16 model with relu activations, i.e.

from keras.models import Sequential
from keras.layers import (ZeroPadding2D, Convolution2D, MaxPooling2D,
                          Flatten, Dense, Dropout)


def VGG_16(weights_path=None):
    model = Sequential()
    model.add(ZeroPadding2D((1, 1),input_shape=(3, 224, 224)))
    model.add(Convolution2D(64, 3, 3, activation='relu'))
    model.add(ZeroPadding2D((1, 1)))
    model.add(Convolution2D(64, 3, 3, activation='relu'))
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))
[...]
    model.add(Flatten())
    model.add(Dense(4096, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(4096, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(1000, activation='softmax'))

    if weights_path:
        model.load_weights(weights_path)

    return model

I'm instantiating it with existing weights and now want to change all relu activations to softmax (not useful, I know):

import keras
from keras.optimizers import SGD

model = VGG_16('vgg16_weights.h5')
sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)

softmax_act = keras.activations.softmax
for (n, layer) in enumerate(model.layers):
    if 'activation' in layer.get_config() and layer.get_config()['activation'] == 'relu':
        print('replacing #{}: {}, {}'.format(n, layer, layer.activation))
        layer.activation = softmax_act
        print('-> {}'.format(layer.activation))

model.compile(optimizer=sgd, loss='categorical_crossentropy')

Note: model.compile is called after the changes, so the model should still be modifiable I guess.

However, even though the debug-prints correctly say

replacing #1: <keras.layers.convolutional.Convolution2D object at 0x7f7d7c497f50>, <function relu at 0x7f7dbe699a28>
-> <function softmax at 0x7f7d7c4972d0>
[...]

the actual results are identical to the model with relu activations.
Why doesn't Keras use the changed activation function?

Recommended answer

You might want to use apply_modifications:

from keras import activations
from vis.utils import utils  # apply_modifications comes from the keras-vis package

idx_of_layer_to_change = -1
model.layers[idx_of_layer_to_change].activation = activations.softmax
model = utils.apply_modifications(model)
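
The same idea can be combined with the replacement loop from the question to swap every relu in the model. Below is a minimal sketch, assuming the keras-vis package is installed (pip install keras-vis) and that VGG_16 is the function defined in the question:

# Minimal sketch, assuming keras-vis is installed and VGG_16 is defined as above.
from keras import activations
from vis.utils import utils

model = VGG_16('vgg16_weights.h5')

# Swap every relu activation for softmax, as attempted in the question.
for layer in model.layers:
    if hasattr(layer, 'activation') and layer.activation is activations.relu:
        layer.activation = activations.softmax

# Rebuild the graph so the new activations actually take effect.
model = utils.apply_modifications(model)

apply_modifications essentially saves the model to a temporary file and reloads it, which forces Keras to rebuild the computation graph with the new activation functions. Simply assigning layer.activation and then calling model.compile, as in the question, does not rebuild the already-built graph, which is why the original attempt produced identical results.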
