How can I overcome "ValueError: Shapes (None, 1) and (None, 7) are incompatible"?

Question

I am new to Keras and CNNs. I am working on an assignment to build a CNN for predicting face emotions. I built the model as per the assignment, but while compiling the model I get "ValueError: Shapes (None, 1) and (None, 7) are incompatible". Can someone help me resolve this?

Pasting my code below for reference:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Conv2D, MaxPooling2D, BatchNormalization,
                                     Activation, Flatten, Dense, Dropout)

model = Sequential()

# Block 1: two 5x5 convolutions with 64 filters each
model.add(Conv2D(filters=64, kernel_size=5, input_shape=(48, 48, 1)))
model.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), padding='valid'))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2), strides=1, padding='valid'))
model.add(Activation('relu'))

# Block 2: two 5x5 convolutions with 128 filters each
model.add(Conv2D(filters=128, kernel_size=5))
model.add(Conv2D(filters=128, kernel_size=5))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Activation('relu'))

# Block 3: two 5x5 convolutions with 256 filters each
model.add(Conv2D(filters=256, kernel_size=5))
model.add(Conv2D(filters=256, kernel_size=5))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Activation('relu'))

# Classifier head: 7-way softmax for the 7 emotion classes
model.add(Flatten())
model.add(Dense(128))
model.add(BatchNormalization())
model.add(Activation('relu'))
model.add(Dropout(0.25))
model.add(Dense(7, activation='softmax'))

Then I tried to compile and fit the model:

model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
history = model.fit(input_array, output_array, batch_size=64, epochs=20, validation_split=0.10)

This gives the error "ValueError: Shapes (None, 1) and (None, 7) are incompatible". I am using Google Colab for this.

Answer

You are most likely feeding your labels sparsely encoded, i.e. as integers like [0,1,2,3,4,5,6], instead of in one-hot-encoded form. That is where the two shapes in the error come from: (None, 1) is the shape of your integer label column, while (None, 7) is the shape of the model's softmax output, and categorical_crossentropy expects the two to match.
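
A quick sanity check, as a sketch assuming output_array is a NumPy array of integer labels (the names are taken from the question's code):

print(output_array.shape)    # e.g. (num_samples, 1): one integer label per sample
print(output_array[:5])      # rows like [[3], [0], [6], ...] rather than one-hot vectors of length 7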

Two possible solutions:

  1. Use the one-hot-encoded form, i.e. transform each label into an array of length == number_of_classes. That is, for 0 you would have [1,0,0,0,0,0,0], for 1 you would have [0,1,0,0,0,0,0], etc.
  2. Use sparse_categorical_crossentropy. If you use this loss function, the one-hot encoding step is done behind the scenes and you no longer need to preprocess your training and validation labels. Both options are sketched below.
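
A minimal sketch of both options, reusing input_array and output_array from the question and assuming output_array holds integer class labels in the range 0 to 6:

from tensorflow.keras.utils import to_categorical

# Option 1: one-hot encode the labels and keep categorical_crossentropy.
output_array_ohe = to_categorical(output_array, num_classes=7)   # label shape becomes (num_samples, 7)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
history = model.fit(input_array, output_array_ohe, batch_size=64, epochs=20, validation_split=0.10)

# Option 2: keep the integer labels and switch the loss to its sparse variant.
model.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])
history = model.fit(input_array, output_array, batch_size=64, epochs=20, validation_split=0.10)

Either change makes the label shape consistent with the model's (None, 7) softmax output; apply one of them, not both.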
