How to remove first N layers from a Keras Model?
Question
I would like to remove the first N layers from a pretrained Keras model. For example, take an EfficientNetB0, whose first 3 layers are responsible only for preprocessing:
import tensorflow as tf
efinet = tf.keras.applications.EfficientNetB0(weights=None, include_top=True)
print(efinet.layers[:3])
# [<tensorflow.python.keras.engine.input_layer.InputLayer at 0x7fa9a870e4d0>,
# <tensorflow.python.keras.layers.preprocessing.image_preprocessing.Rescaling at 0x7fa9a61343d0>,
# <tensorflow.python.keras.layers.preprocessing.normalization.Normalization at 0x7fa9a60d21d0>]
As M.Innat mentioned, the first layer is an Input layer, which should be either spared or re-attached. I would like to remove those layers, but a simple approach like this throws an error:
cut_input_model = tf.keras.Model(
    inputs=[efinet.layers[3].input],
    outputs=efinet.outputs,
)
This results in:
ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(...)
What is the recommended approach?
Answer
The reason for the Graph disconnected error is that you don't specify an Input layer. But that's not the main issue here. Removing an intermediate layer from a keras model is sometimes not straightforward with either the Sequential or the Functional API.
For a sequential model it should be comparatively easy, whereas in a functional model you need to take care of multi-input blocks (e.g. multiply, add, etc.). For example, if you want to remove some intermediate layer in a sequential model, you can easily adapt this solution. But for a functional model (efficientnet), you can't, because of the multi-input internal blocks, and you will encounter this error: ValueError: A merge layer should be called on a list of inputs. So that needs a bit more work AFAIK; here is a possible approach to overcome it.
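For the simpler sequential case mentioned above, here is a minimal sketch (my own illustration, not part of the original answer) of re-wiring a fresh Input past the first layer. The toy model and layer choices are assumptions for demonstration; `tf.keras.layers.Rescaling` is available as a top-level layer in TF 2.6+:

```python
import tensorflow as tf

# A toy sequential model whose first layer is preprocessing we want to drop.
seq = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Rescaling(1.0 / 255),     # layer to remove
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(4),
])

# Re-wire a new Input directly into the layers after the preprocessing.
# Note: for Sequential models, `seq.layers` excludes the InputLayer,
# so layers[0] is the Rescaling layer and layers[1:] skips it.
new_input = tf.keras.Input(shape=(8,))
x = new_input
for layer in seq.layers[1:]:
    x = layer(x)
trimmed = tf.keras.Model(new_input, x)
```

Because the loop reuses the original layer objects, `trimmed` shares its weights with `seq`. This re-wiring trick is exactly what breaks on a functional model like efficientnet: the first multi-input block raises the merge-layer error above.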
Here I will show a simple workaround for your case, but it is probably not general and can also be unsafe in some cases. It is based on this approach, using the pop method. It can be unsafe because _layers is a private attribute, and popping from it mutates the layer list without rebuilding the model's call graph. Okay, let's first load the model.
func_model = tf.keras.applications.EfficientNetB0()
for i, l in enumerate(func_model.layers):
    print(l.name, l.output_shape)
    if i == 8: break
input_19 [(None, 224, 224, 3)]
rescaling_13 (None, 224, 224, 3)
normalization_13 (None, 224, 224, 3)
stem_conv_pad (None, 225, 225, 3)
stem_conv (None, 112, 112, 32)
stem_bn (None, 112, 112, 32)
stem_activation (None, 112, 112, 32)
block1a_dwconv (None, 112, 112, 32)
block1a_bn (None, 112, 112, 32)
Next, use the .pop method:
func_model._layers.pop(1) # remove rescaling
func_model._layers.pop(1) # remove normalization
for i, l in enumerate(func_model.layers):
    print(l.name, l.output_shape)
    if i == 8: break
input_22 [(None, 224, 224, 3)]
stem_conv_pad (None, 225, 225, 3)
stem_conv (None, 112, 112, 32)
stem_bn (None, 112, 112, 32)
stem_activation (None, 112, 112, 32)
block1a_dwconv (None, 112, 112, 32)
block1a_bn (None, 112, 112, 32)
block1a_activation (None, 112, 112, 32)
block1a_se_squeeze (None, 32)
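A more robust alternative (a sketch of my own, not from the original answer) is to rebuild the whole graph with tf.keras.models.clone_model, swapping the preprocessing layers for identity layers via its clone_function argument. This assumes the layers to strip are the Rescaling/Normalization pair shown above; also note that a clone gets freshly initialized weights, so pretrained weights would have to be copied over separately (e.g. layer by layer, matching names):

```python
import tensorflow as tf

def strip_preprocessing(layer):
    # Swap Rescaling/Normalization for a no-op identity layer; every other
    # layer is cloned from its own config (clone_model's default behavior).
    if layer.__class__.__name__ in ("Rescaling", "Normalization"):
        return tf.keras.layers.Activation("linear", name=layer.name)
    return layer.__class__.from_config(layer.get_config())

func_model = tf.keras.applications.EfficientNetB0(weights=None)
clean_model = tf.keras.models.clone_model(
    func_model, clone_function=strip_preprocessing
)

print([l.__class__.__name__ for l in clean_model.layers[:4]])
```

Unlike the _layers.pop trick, this actually rebuilds the call graph, so the identity layers really replace the preprocessing in the forward pass rather than merely disappearing from the layer list.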