How to concatenate a tensor to a keras layer along batch (without specifying batch size)?


Question

I want to concatenate the output from an embedding layer with a custom tensor (myarr / myconst). I can specify everything with a fixed batch size as follows:

import numpy as np
import tensorflow as tf

BATCH_SIZE = 100
myarr = np.ones((10, 5))
myconst = tf.constant(np.tile(myarr, (BATCH_SIZE, 1, 1)))

# Model definition
inputs = tf.keras.layers.Input((10,), batch_size=BATCH_SIZE)
x = tf.keras.layers.Embedding(10, 5)(inputs)
x = tf.keras.layers.Concatenate(axis=1)([x, myconst])
model = tf.keras.models.Model(inputs=inputs, outputs=x)
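The shape arithmetic behind this fixed-batch version can be sketched in plain NumPy (an illustration of the shapes involved, not part of the model itself):

```python
import numpy as np

BATCH_SIZE = 100
myarr = np.ones((10, 5))

# np.tile copies the (10, 5) array BATCH_SIZE times along a new leading axis
tiled = np.tile(myarr, (BATCH_SIZE, 1, 1))
print(tiled.shape)  # (100, 10, 5)

# The embedding output has the same (batch, 10, 5) shape, so concatenating
# along axis=1 yields (batch, 20, 5)
emb_like = np.zeros((BATCH_SIZE, 10, 5))
out = np.concatenate([emb_like, tiled], axis=1)
print(out.shape)    # (100, 20, 5)
```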

However, if I don't specify the batch size and don't tile my array, i.e. just the following...

myarr = np.ones((10, 5))
myconst = tf.constant(myarr)

# Model definition
inputs = tf.keras.layers.Input((10,))
x = tf.keras.layers.Embedding(10, 5)(inputs)
x = tf.keras.layers.Concatenate(axis=1)([x, myconst])
model = tf.keras.models.Model(inputs=inputs, outputs=x)

... I get an error saying that shapes [(None, 10, 5), (10, 5)] can't be concatenated. Is there a way to add this None / batch_size axis so as to avoid the tiling?
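The failure mirrors plain NumPy behaviour: concatenation requires operands of the same rank, so a rank-3 (batch, 10, 5) tensor can't be joined with a rank-2 (10, 5) constant. A NumPy sketch of the same shape error:

```python
import numpy as np

batch_output = np.ones((4, 10, 5))   # stand-in for the (None, 10, 5) embedding output
const = np.ones((10, 5))             # the un-tiled 2D constant

try:
    np.concatenate([batch_output, const], axis=1)
    same_rank_ok = True
except ValueError as err:
    # Rank mismatch: all inputs must have the same number of dimensions
    same_rank_ok = False
    print("concatenation failed:", err)
```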

Thanks in advance

Answer

You want to concatenate a constant of shape (10, 5) to a 3D tensor of shape (batch, 10, 5). To do this, the constant must itself be 3D: reshape it to (1, 10, 5) and repeat it along axis=0 at run time so that its leading axis matches the batch size, after which the concatenation works.

We do this inside a Lambda layer:

import numpy as np
import tensorflow as tf

# Dummy training data: 100 samples of 10 integer tokens; targets of shape (20, 5)
X = np.random.randint(0, 10, (100, 10))
Y = np.random.uniform(0, 1, (100, 20, 5))

# Constant with an explicit leading axis of size 1: (1, 10, 5)
myarr = np.ones((1, 10, 5)).astype('float32')
myconst = tf.convert_to_tensor(myarr)

def repeat_const(tensor, myconst):
    # Repeat the constant along axis 0 to match the dynamic batch size
    shapes = tf.shape(tensor)
    return tf.repeat(myconst, shapes[0], axis=0)

inputs = tf.keras.layers.Input((10,))
x = tf.keras.layers.Embedding(10, 5)(inputs)                        # (batch, 10, 5)
xx = tf.keras.layers.Lambda(lambda x: repeat_const(x, myconst))(x)  # (batch, 10, 5)
x = tf.keras.layers.Concatenate(axis=1)([x, xx])                    # (batch, 20, 5)
model = tf.keras.models.Model(inputs=inputs, outputs=x)
model.compile('adam', 'mse')

model.fit(X, Y, epochs=3)
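Shape-wise, the Lambda trick is equivalent to the tiling from the fixed-batch version, except that the repeat count is read from the tensor at run time. In NumPy terms (an illustrative sketch with an arbitrary batch size):

```python
import numpy as np

batch = 7                               # any run-time batch size
myarr = np.ones((1, 10, 5))

# Repeat the length-1 leading axis to match the batch, as repeat_const does
repeated = np.repeat(myarr, batch, axis=0)
print(repeated.shape)                   # (7, 10, 5)

# Identical result to tiling the original 2D array batch times
tiled = np.tile(myarr[0], (batch, 1, 1))
print(np.array_equal(repeated, tiled))  # True

# Concatenating with a (batch, 10, 5) embedding output along axis=1
emb_like = np.zeros((batch, 10, 5))
out = np.concatenate([emb_like, repeated], axis=1)
print(out.shape)                        # (7, 20, 5)
```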

