Keras, How to get the output of each layer?

Problem Description

I have trained a binary classification model with a CNN; here is my code:

from keras.models import Sequential
from keras.layers import (Convolution2D, MaxPooling2D, Activation,
                          Flatten, Dense, Dropout)

model = Sequential()
model.add(Convolution2D(nb_filters, kernel_size[0], kernel_size[1],
                        border_mode='valid',
                        input_shape=input_shape))
model.add(Activation('relu'))
model.add(Convolution2D(nb_filters, kernel_size[0], kernel_size[1]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=pool_size))
# (16, 16, 32)
model.add(Convolution2D(nb_filters*2, kernel_size[0], kernel_size[1]))
model.add(Activation('relu'))
model.add(Convolution2D(nb_filters*2, kernel_size[0], kernel_size[1]))
model.add(Activation('relu'))
model.add(MaxPooling2D(pool_size=pool_size))
# (8, 8, 64) = (2048)
model.add(Flatten())
model.add(Dense(1024))
model.add(Activation('relu'))
model.add(Dropout(0.5))
model.add(Dense(2))  # define a binary classification problem
model.add(Activation('softmax'))

model.compile(loss='categorical_crossentropy',
              optimizer='adadelta',
              metrics=['accuracy'])
model.fit(x_train, y_train,
          batch_size=batch_size,
          nb_epoch=nb_epoch,
          verbose=1,
          validation_data=(x_test, y_test))

And here, I want to get the output of each layer, just like in TensorFlow. How can I do that?

Recommended Answer

You can easily get the output of any layer by using: model.layers[index].output
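For instance, `model.layers[index].output` is a symbolic tensor, not concrete values; its static shape can be inspected directly. A minimal standalone sketch (a hypothetical two-layer net, using the `tf.keras` API of newer releases rather than the question's Keras 1 code):

```python
import tensorflow as tf
from tensorflow.keras import layers, Sequential

# Hypothetical tiny model, just to inspect the symbolic layer outputs.
m = Sequential([
    layers.Input(shape=(4,)),
    layers.Dense(8, activation='relu'),
    layers.Dense(2, activation='softmax'),
])

# Each entry is a symbolic tensor; the leading None is the batch dimension.
print(m.layers[0].output.shape)  # (None, 8)
print(m.layers[1].output.shape)  # (None, 2)
```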

For all layers, use this:

import numpy as np
from keras import backend as K

inp = model.input                                           # input placeholder
outputs = [layer.output for layer in model.layers]          # all layer outputs
functors = [K.function([inp, K.learning_phase()], [out]) for out in outputs]    # evaluation functions

# Testing
test = np.random.random(input_shape)[np.newaxis,...]
layer_outs = [func([test, 1.]) for func in functors]
print(layer_outs)

Note: To simulate Dropout, pass 1. as the learning_phase value (training mode); otherwise pass 0. (test mode).

(Based on comments)

K.function creates Theano/TensorFlow tensor functions, which are later used to get the outputs from the symbolic graph given the input.

K.learning_phase() is required as an input because many Keras layers, such as Dropout and BatchNormalization, depend on it to change their behavior between training and test time.
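The same training/test switch exists in newer tf.keras versions, where K.learning_phase() has been removed and each layer call takes a `training` argument instead. A minimal sketch with a standalone Dropout layer (hypothetical example, not part of the question's model):

```python
import numpy as np
import tensorflow as tf

drop = tf.keras.layers.Dropout(0.5)
x = np.ones((1, 8), dtype='float32')

test_out = drop(x, training=False)   # test mode: Dropout is the identity
train_out = drop(x, training=True)   # training mode: ~half the units zeroed

print(np.array_equal(test_out.numpy(), x))  # True
```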

So if you remove the dropout layer from your code, you can simply use:

import numpy as np
from keras import backend as K

inp = model.input                                           # input placeholder
outputs = [layer.output for layer in model.layers]          # all layer outputs
functors = [K.function([inp], [out]) for out in outputs]    # evaluation functions

# Testing
test = np.random.random(input_shape)[np.newaxis,...]
layer_outs = [func([test]) for func in functors]
print(layer_outs)

A More Optimized Way

I just realized that the previous answer is not that optimized: for each function evaluation, the data is transferred CPU->GPU, and the tensor calculations for the lower layers are repeated over and over.

Instead, this is a much better way, since you don't need multiple functions, just a single function that gives you the list of all outputs:

import numpy as np
from keras import backend as K

inp = model.input                                           # input placeholder
outputs = [layer.output for layer in model.layers]          # all layer outputs
functor = K.function([inp, K.learning_phase()], outputs )   # evaluation function

# Testing
test = np.random.random(input_shape)[np.newaxis,...]
layer_outs = functor([test, 1.])
print(layer_outs)
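In newer Keras versions (tf.keras 2.x and later), the K.function/K.learning_phase machinery above is gone, but the same single-pass idea can be sketched with a functional-API Model whose output list contains every layer's output. This is a hypothetical standalone model, since the question's variables (nb_filters, input_shape, etc.) are not defined here:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model, Sequential

model = Sequential([
    layers.Input(shape=(8, 8, 1)),
    layers.Conv2D(4, (3, 3), activation='relu'),
    layers.Flatten(),
    layers.Dense(2, activation='softmax'),
])

# One Model with a list of outputs: a single forward pass yields them all.
extractor = Model(inputs=model.input,
                  outputs=[layer.output for layer in model.layers])

test = np.random.random((1, 8, 8, 1)).astype('float32')
layer_outs = extractor(test, training=False)  # training=False: test-mode Dropout/BN
for out in layer_outs:
    print(out.shape)
```

Passing `training=False` (or `True`) replaces the old learning-phase flag.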
