how to know which node is dropped after using keras dropout layer
Question
From Nick's blog it is clear that in the dropout layer of a CNN model we drop some nodes on the basis of a Bernoulli distribution. But how can this be verified, i.e. how can we check which nodes were not selected? In DropConnect we drop individual weights, so I think it could be verified with the help of model.get_weights(), but how can this be done in the case of a dropout layer?
import keras
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense

model = Sequential()
model.add(Conv2D(2, kernel_size=(3, 3),
                 activation='relu',
                 input_shape=input_shape))
model.add(Conv2D(4, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(8, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(num_classes, activation='softmax'))
model.compile(loss=keras.losses.binary_crossentropy,
              optimizer=keras.optimizers.Adadelta(),
              metrics=['accuracy'])
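One way to see which nodes a dropout layer zeroed out is to run the layer in training mode and look for exact zeros in its output: dropped units are 0, and survivors are rescaled by 1/(1 - rate). A minimal sketch, assuming TF 2.x Keras (the one-layer probe model below is a hypothetical toy, not the model from the question):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# A probe model containing only a Dropout layer with rate = 0.5.
inputs = keras.Input(shape=(8,))
outputs = keras.layers.Dropout(0.5, seed=0)(inputs)
probe = keras.Model(inputs, outputs)

x = np.ones((1, 8), dtype="float32")
# training=True forces the dropout mask to be applied (it is skipped at inference).
y = probe(x, training=True).numpy()
# Zeros mark dropped units; surviving units are scaled to 1 / (1 - 0.5) = 2.0.
print(y)
```

The same idea applies to the full model from the question: build a sub-model whose output is the tensor right after a Dropout layer and call it with training=True.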
Another question: the Keras documentation says the dropout rate should be a float between 0 and 1. But for the above model, when I set the dropout rate to 1.25 the model still works. How does this happen?
Answer
Concerning your second question: if you look at the Keras source code, in the call method of the Dropout class:
def call(self, inputs, training=None):
    if 0. < self.rate < 1.:
        noise_shape = self._get_noise_shape(inputs)

        def dropped_inputs():
            return K.dropout(inputs, self.rate, noise_shape,
                             seed=self.seed)
        return K.in_train_phase(dropped_inputs, inputs,
                                training=training)
    return inputs
This means that if the rate is not strictly between 0 and 1, the layer does nothing: the inputs pass through unchanged, which is why a rate of 1.25 silently disables dropout instead of raising an error.
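To illustrate that guard, here is a minimal NumPy sketch (a hypothetical stand-in mirroring the logic of the call method above, not the actual Keras source):

```python
import numpy as np

# Hypothetical stand-in that mirrors the `0. < rate < 1.` guard in Dropout.call.
def dropout_like(inputs, rate, training=True):
    if 0. < rate < 1. and training:
        # Inverted dropout: zero out units, rescale the survivors.
        mask = (np.random.rand(*inputs.shape) >= rate).astype(inputs.dtype)
        return inputs * mask / (1. - rate)
    return inputs  # rate outside (0, 1): the layer is a no-op

x = np.ones((2, 3))
# rate = 1.25 falls outside (0, 1), so the input passes through unchanged.
assert np.array_equal(dropout_like(x, 1.25), x)
```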