Input 0 of layer lstm_9 is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: [None, 2, 4000, 256]
Question
I try to create a model with an RNN network but I receive this error: Input 0 of layer lstm_9 is incompatible with the layer: expected ndim=3, found ndim=4. Full shape received: [None, 2, 4000, 256].
Input
train_data.shape = (100, 2, 4000)
train_labels.shape = (100,)
labels_values = 0 or 1 (two classes)
Model
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense

input = Input(shape=(2, 4000)) # shape from train_data
embedded = Embedding(2, 256)(input)
lstm = LSTM(1024, return_sequences=True)(embedded) # ERROR
dense = Dense(2, activation='softmax')(lstm)
Answer
Unfortunately, your whole concept of designing Keras functional models with embedding layers is wrong.
- When you use an Embedding layer, it expects 2D data.
Input shape
2D tensor with shape: (batch_size, sequence_length).
Output shape
3D tensor with shape: (batch_size, sequence_length, output_dim).
Reference: https://keras.io/layers/embeddings/
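The shape rule above also explains the original error. The following is an illustrative sketch in plain Python (not Keras itself): an Embedding layer maps each integer ID to a vector, so its output shape is simply the input shape with one extra trailing dimension of size output_dim.

```python
# Shape rule for Embedding: append output_dim to the input shape.
# (A hand illustration of the rule, not the Keras implementation.)
def embedding_output_shape(input_shape, output_dim):
    return input_shape + (output_dim,)

# 2D input (batch, seq_len) -> 3D output, which LSTM accepts:
print(embedding_output_shape((None, 4000), 256))     # (None, 4000, 256)

# The question's 3D input (batch, 2, 4000) -> 4D output, hence
# "expected ndim=3, found ndim=4":
print(embedding_output_shape((None, 2, 4000), 256))  # (None, 2, 4000, 256)
```

Feeding the 3D train_data into Embedding therefore produces the 4D tensor [None, 2, 4000, 256] that LSTM rejects.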
It takes a sequence of IDs or tokens from the vocabulary. This must be an integer array.
Let's say our vocabulary has length 36; we pass it a list of integer arrays with values in the range [0, 36):

[1, 34, 32, 23] is valid; [0.2, 0.5] is not valid.
Usually we use Embedding to represent the vectors in a reduced space, so output_dim is lower than input_dim, but the opposite can also be true depending on the design.
You need to specify the input_length for the input data.
If you use return_sequences=True, the temporal dimension will be passed on to the next layer, which is not desired in your case.
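The effect of return_sequences on the output shape can be sketched as a plain-Python shape rule (an illustration, not the Keras implementation):

```python
# Shape rule for LSTM given a (batch, timesteps, features) input.
def lstm_output_shape(input_shape, units, return_sequences):
    batch, timesteps, _features = input_shape
    if return_sequences:
        return (batch, timesteps, units)  # keeps the temporal dimension
    return (batch, units)                 # only the final hidden state

print(lstm_output_shape((None, 4000, 256), 256, return_sequences=True))   # (None, 4000, 256)
print(lstm_output_shape((None, 4000, 256), 256, return_sequences=False))  # (None, 256)
```

With return_sequences=False, the LSTM emits a single 2D tensor that can go straight into the final Dense classifier.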
You have labels in the form (0, 1, 0, 1, 0, 0, ...) rather than one-hot encoded, so don't use softmax; use sigmoid with 1 unit in the last Dense layer.
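A quick numeric check (plain Python, stdlib only) shows why a 1-unit softmax is useless: softmax normalizes over the units, so with a single unit it always outputs 1.0 no matter what the logit is, while sigmoid yields a genuine probability for the binary label:

```python
import math

def softmax(logits):
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Softmax over a single unit is constant 1.0, regardless of the logit:
print(softmax([-3.7]))  # [1.0]
print(softmax([12.9]))  # [1.0]

# Sigmoid on one unit gives a real probability for 0/1 labels:
print(sigmoid(0.0))     # 0.5
```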
Here is the slightly corrected network:
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense
from tensorflow.keras.models import Model
import numpy as np

train_data = np.random.randint(0, 3, (100, 4000))  # integer token IDs, 2D input
y_labels = np.random.randint(0, 2, (100,))

input_ = Input(shape=(4000,)) # shape from train_data
embedded = Embedding(36, 256, input_length=4000)(input_)
lstm = LSTM(256, return_sequences=False)(embedded)
dense = Dense(1, activation='sigmoid')(lstm)  # sigmoid for binary labels
model = Model(input_, dense)
model.summary()
Model: "model"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_6 (InputLayer) [(None, 4000)] 0
_________________________________________________________________
embedding_5 (Embedding) (None, 4000, 256) 9216
_________________________________________________________________
lstm_5 (LSTM) (None, 256) 525312
_________________________________________________________________
dense (Dense) (None, 1) 257
=================================================================
Total params: 534,785
Trainable params: 534,785
Non-trainable params: 0
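The parameter counts in that summary can be verified by hand with the standard formulas (a sanity check in plain Python, not Keras itself): Embedding has input_dim * output_dim weights, an LSTM has 4 gates each with (features + units + 1 bias) * units weights, and Dense has in * out weights plus out biases.

```python
# Hand calculation of the summary's "Param #" column.
emb_params = 36 * 256                      # input_dim * output_dim
lstm_params = 4 * ((256 + 256 + 1) * 256)  # 4 gates * (features + units + bias) * units
dense_params = 256 * 1 + 1                 # weights + bias

print(emb_params)                                # 9216
print(lstm_params)                               # 525312
print(dense_params)                              # 257
print(emb_params + lstm_params + dense_params)   # 534785
```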