Flag for training and test for custom layer in Keras

Problem description

I want to create a custom keras layer which does something during training and something else for validation or testing.

from tensorflow import keras
K = keras.backend
from keras.layers import Layer
import tensorflow as tf

class MyCustomLayer(Layer):

    def __init__(self, ratio=0.5, **kwargs):
        self.ratio = ratio
        super(MyCustomLayer, self).__init__(**kwargs)

    @tf.function
    def call(self, x, is_training=None):

        is_training = K.learning_phase()
        tf.print("training: ", is_training)
        if is_training is 1 or is_training is True:

            xs = x * 4
            return xs
        else:
            xs = x*0
            return xs

model = Sequential()
model.add(Dense(16, input_dim=input_dim))
model.add(MyCustomLayer(0.5))
model.add(ReLU())
model.add(Dense(32, activation='relu'))
model.add(Dense(16, activation='relu'))
model.add(Dense(output_dim, activation='softmax', kernel_regularizer=l2(0.01)))
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])


model.fit(X_train, y_train, validation_split=0.05, epochs=5)

In the output I always get:

training:  0
training:  0
training:  0
training:  0
training:  0
training:  0
training:  0
training:  0

Does anyone know how to fix this?

Solution

There are a few issues and misconceptions here. First, you are mixing keras and tf.keras imports; you should use only one of them. Second, the parameter of call is named training, not is_training.

I think the issue is that tf.print does not really print the value of the training variable, since it is a TensorFlow symbolic variable and its value may change indirectly. There are other ways to check whether the layer behaves differently during training and inference, for example:

from keras import backend as K
from keras.layers import Layer

class MyCustomLayer(Layer):

    def __init__(self, ratio=0.5, **kwargs):
        super(MyCustomLayer, self).__init__(**kwargs)

    def call(self, inputs, training=None):
        train_x = inputs * 4  # branch used during training
        test_x = inputs * 0   # branch used during inference

        # K.in_train_phase selects between the two tensors
        # depending on the current learning phase
        return K.in_train_phase(train_x,
                                test_x,
                                training=training)

Then using this model:

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(1, input_dim=10))
model.add(MyCustomLayer(0.5))
model.compile(loss='mse', optimizer='adam')

And creating a function that explicitly receives the K.learning_phase() variable:

fun = K.function([model.input, K.learning_phase()], [model.output])

If you call it with K.learning_phase() set to 1 or to 0, you do see different outputs:

import numpy as np

d = np.random.random(size=(2, 10))
print(fun([d, 1]))
print(fun([d, 0]))

Result:

[array([[4.1759257], [3.9988194]], dtype=float32)]
[array([[0.], [0.]], dtype=float32)]
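
As a quick cross-check (a minimal sketch, assuming the same model and d as above), model.predict always runs in the inference phase, so its output should match that of fun([d, 0]):

# predict() runs with the learning phase set to 0, so it should
# reproduce the inference branch (inputs * 0) of the custom layer
print(model.predict(d))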

This indicates that the layer behaves differently during training and during inference/testing.
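
If you work entirely within tf.keras instead of standalone Keras (only one of the two should be used, as noted above), the same idea is usually expressed through the training argument directly. Below is a minimal sketch, assuming TensorFlow 2.x; the class name MyCustomLayerTF is just illustrative, and Keras passes training automatically from fit()/predict():

import tensorflow as tf

class MyCustomLayerTF(tf.keras.layers.Layer):

    def __init__(self, ratio=0.5, **kwargs):
        super(MyCustomLayerTF, self).__init__(**kwargs)
        self.ratio = ratio

    def call(self, inputs, training=None):
        # `training` is a Python bool (or None) supplied by Keras:
        # fit() passes True, predict()/evaluate() pass False
        if training:
            return inputs * 4  # training-time behaviour
        return inputs * 0      # inference-time behaviour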
