Activation function error in a 1D CNN in Keras


Problem description

I'm creating a model to classify whether the input waveform contains a rising edge of SDA on an I2C line.

My input has 20000 data points per waveform and 100 training samples.
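For reference, Conv1D with input_shape=(20000, 1) expects the training array to be shaped (samples, timesteps, channels) and, for one binary output per waveform, labels shaped (samples, 1). A placeholder sketch of those assumed shapes:

import numpy as np

# Placeholder arrays, only to illustrate the assumed shapes:
# 100 waveforms, 20000 points each, one channel, one binary label per waveform.
train_data = np.random.randn(100, 20000, 1).astype('float32')  # (samples, timesteps, channels)
train_label = np.random.randint(0, 2, size=(100, 1))           # (samples, 1)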

I initially found an answer regarding the input here: Keras 1D CNN: how to correctly specify the dimensions?

However, I'm getting an error in the activation function:

ValueError: Error when checking target: expected activation_1 to have 3 dimensions, but got array with shape (100, 1)

My model is:

import numpy as np
from keras.models import Sequential
from keras.layers import Conv1D, BatchNormalization, MaxPooling1D, Dense, Activation
from keras.optimizers import Adam

model = Sequential()
model.add(Conv1D(filters=n_filter,
                 kernel_size=input_filter_length,
                 strides=1,
                 activation='relu',
                 input_shape=(20000, 1)))
model.add(BatchNormalization())
model.add(MaxPooling1D(pool_size=4, strides=None))

model.add(Dense(1))
model.add(Activation("sigmoid"))

adam = Adam(lr=learning_rate)

model.compile(optimizer=adam, loss='binary_crossentropy', metrics=['accuracy'])

model.fit(train_data, train_label,
          epochs=10,
          batch_size=batch_size, shuffle=True)

score = np.asarray(model.evaluate(test_new_data, test_label, batch_size=batch_size)) * 100.0

I can't figure out the problem here: why does the activation function expect a 3-D tensor?

Answer

The problem lies in the fact that, starting from Keras 2.0, a Dense layer applied to a sequence is applied to every time step, so given a sequence it produces a sequence. Your Dense layer is therefore producing a sequence of 1-element vectors, and this causes your error (because the target is not a sequence).
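To see this concretely, you can inspect model.output_shape. A minimal sketch of the architecture from the question, with illustrative values for n_filter and input_filter_length (which are not given in the post):

from keras.models import Sequential
from keras.layers import Conv1D, BatchNormalization, MaxPooling1D, Dense, Activation

model = Sequential()
model.add(Conv1D(filters=8, kernel_size=16, strides=1,
                 activation='relu', input_shape=(20000, 1)))
model.add(BatchNormalization())
model.add(MaxPooling1D(pool_size=4, strides=None))
model.add(Dense(1))
model.add(Activation('sigmoid'))

# The time dimension survives, so the output is still a sequence, e.g. (None, 4996, 1)
# with these hyperparameters, which is why Keras expects a 3-D target rather than
# the (100, 1) label array.
print(model.output_shape)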

There are several ways to reduce the sequence to a vector and then apply a Dense layer to it:

  1. GlobalPooling:

You can use a global pooling layer such as GlobalAveragePooling1D or GlobalMaxPooling1D, e.g.:

model.add(Conv1D(filters=n_filter,
         kernel_size=input_filter_length,
         strides=1,
         activation='relu',
         input_shape=(20000,1)))
model.add(BatchNormalization())
# GlobalMaxPooling1D collapses the whole sequence into one vector per channel
# and takes no pool_size/strides arguments.
model.add(GlobalMaxPooling1D())

model.add(Dense(1))
model.add(Activation("sigmoid"))

  2. Flattening:

    You can collapse the whole sequence into a single vector using a Flatten layer:

    model.add(Conv1D(filters=n_filter,
             kernel_size=input_filter_length,
             strides=1,
             activation='relu',
             input_shape=(20000,1)))
    model.add(BatchNormalization())
    model.add(MaxPooling1D(pool_size=4, strides=None))
    model.add(Flatten())
    
    model.add(Dense(1))
    model.add(Activation("sigmoid"))
    

  3. RNN postprocessing:

    You can also add a recurrent layer on top of your sequence and make it return only the last output:

    model.add(Conv1D(filters=n_filter,
             kernel_size=input_filter_length,
             strides=1,
             activation='relu',
             input_shape=(20000,1)))
    model.add(BatchNormalization())
    model.add(MaxPooling1D(pool_size=4, strides=None))
    model.add(SimpleRNN(10, return_sequences=False))
    
    model.add(Dense(1))
    model.add(Activation("sigmoid"))
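Whichever of the three variants is used, a quick sanity check before training is that the model output is now two-dimensional, so it matches the (100, 1) labels. A short sketch reusing the variables from the question:

# With GlobalMaxPooling1D, Flatten or SimpleRNN(return_sequences=False) in place,
# the output shape becomes (None, 1) and fit() accepts the (100, 1) targets.
print(model.output_shape)  # (None, 1)

adam = Adam(lr=learning_rate)
model.compile(optimizer=adam, loss='binary_crossentropy', metrics=['accuracy'])
model.fit(train_data, train_label, epochs=10, batch_size=batch_size, shuffle=True)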
    
