Custom linear transformation in Keras

Problem description

I want to build a custom layer in Keras that applies a linear transformation to the output of the last layer. For example, given an output X from the last layer, my new layer should output X.dot(W) + b.

The shape of W is (49, 10), the shape of X should be (64, 49), and the shape of b is (10,).

However, the shape of X is (?, 7, 7, 64), and when I try to reshape it, it becomes shape=(64, ?). What does the question mark mean? What is the proper way to apply a linear transformation to the output of the last layer?
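In NumPy terms, the transformation being asked for is just a matrix product plus a broadcast bias. A quick sketch of the shapes involved (NumPy stands in for the Keras tensors here; 64 is the batch size from the question):

```python
import numpy as np

X = np.random.randn(64, 49)   # output of the previous layer
W = np.random.randn(49, 10)   # weight matrix
b = np.random.randn(10)       # bias, broadcast over the batch axis

Y = X.dot(W) + b              # (64, 49) @ (49, 10) + (10,) -> (64, 10)
print(Y.shape)                # (64, 10)
```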

Recommended answer

The question mark generally represents the batch size, which has no effect on the model architecture.

You should be able to reshape your X with keras.layers.Reshape((64,49))(X).
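Note that Keras' Reshape target shape leaves out the batch axis, so this works per sample: a single sample of shape (7, 7, 64) has 7 * 7 * 64 = 3136 elements, which matches 64 * 49. A NumPy check of the element counts (keep in mind this only relabels the flat element order; it does not transpose the channel axis to the front):

```python
import numpy as np

# One sample of the (?, 7, 7, 64) tensor, with the batch axis dropped.
sample = np.arange(7 * 7 * 64).reshape(7, 7, 64)

# Valid because 7 * 7 * 64 == 64 * 49 == 3136.
reshaped = sample.reshape(64, 49)
print(reshaped.shape)   # (64, 49)
```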

You can wrap arbitrary TensorFlow operations such as tf.matmul in a Lambda layer to include custom operations in your Keras model. A minimal working example:

import tensorflow as tf
from keras.layers import Dense, Lambda, Input
from keras.models import Model

# Fixed (non-trainable) weights for the linear transformation.
# In TensorFlow 2.x, use tf.random.normal instead of tf.random_normal.
W = tf.random_normal(shape=(128, 20))
b = tf.random_normal(shape=(20,))

inp = Input(shape=(10,))
x = Dense(128)(inp)                           # (batch, 128)
y = Lambda(lambda x: tf.matmul(x, W) + b)(x)  # (batch, 128) @ (128, 20) + (20,) -> (batch, 20)
model = Model(inp, y)
