Ragged tensors as input for LSTM


Problem Description

I'm learning about ragged tensors and how I can use them with TensorFlow. My example:

import numpy as np
import tensorflow as tf

xx = tf.ragged.constant([
                        [0.1, 0.2],
                        [0.4, 0.7, 0.5, 0.6]
                        ])
yy = np.array([[0, 0, 1], [1,0,0]])

mdl = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=[None], batch_size=2, dtype=tf.float32, ragged=True),
    tf.keras.layers.LSTM(64),  
    tf.keras.layers.Dense(3, activation='softmax')
])

mdl.compile(loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              optimizer=tf.keras.optimizers.Adam(1e-4),
              metrics=['accuracy'])

mdl.summary()
history = mdl.fit(xx, yy, epochs=10)

Error

Input 0 of layer lstm_152 is incompatible with the layer: expected ndim=3, found ndim=2. Full shape received: [2, None]

I am not sure if I can use ragged tensors like this. All the examples I found have an embedding layer before the LSTM, but I don't want to create an additional embedding layer.
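
For context, the mismatch can be seen by inspecting the ragged tensor's rank directly; a minimal check, assuming TensorFlow 2.x:

import tensorflow as tf

xx = tf.ragged.constant([
    [0.1, 0.2],
    [0.4, 0.7, 0.5, 0.6]
])
# Rank 2: (batch, ragged values) -- but an LSTM expects
# rank-3 input: (batch, timesteps, features)
print(xx.shape)  # (2, None)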

Recommended Answer

I recommend using the Input layer rather than InputLayer; you usually don't need to use InputLayer. In any case, the problem is that the shape of your input and the LSTM layer's input shape were wrong. Here is the modification I made, with some comments.

import numpy as np
import tensorflow as tf

# xx should be 3-D for the LSTM: (batch, timesteps, features)
xx = tf.ragged.constant([
                        [[0.1, 0.2]],
                        [[0.4, 0.7, 0.5, 0.6]]
                        ])

"""
Labels represented as OneHotEncoding so you 
should use CategoricalCrossentropy instade of SparseCategoricalCrossentropy
"""

yy = np.array([[0, 0, 1], [1,0,0]])

# For a ragged tensor, get the maximum sequence length
max_seq = int(xx.bounding_shape()[-1])

mdl = tf.keras.Sequential([
    # Input layer with shape = [timesteps (None), features (max_seq)]
    tf.keras.layers.Input(shape=[None, max_seq], batch_size=2, dtype=tf.float32, ragged=True),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(3, activation='softmax')
])

# CategoricalCrossentropy: the Dense layer already applies softmax,
# so the loss receives probabilities rather than logits
mdl.compile(loss=tf.keras.losses.CategoricalCrossentropy(from_logits=False),
              optimizer=tf.keras.optimizers.Adam(1e-4),
              metrics=['accuracy'])

mdl.summary()
history = mdl.fit(xx, yy, epochs=10)
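
As a quick sanity check, the shapes the fix relies on can be printed before training; a small sketch using the same xx and max_seq as above:

# Bounding shape of the ragged batch: 2 samples, 1 timestep, at most 4 features
print(xx.bounding_shape())  # tf.Tensor([2 1 4], shape=(3,), dtype=int64)
print(max_seq)              # 4 -> used as the feature size of the Input layer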
