How to add a trainable Hadamard product layer in Keras?

Problem description

I am trying to introduce sparsity in the training samples. My data matrix has a size of (say) NxP and I want to pass it through a Keras layer whose weights have the same size as the input, i.e. the trainable weight matrix W has a shape of NxP. I want to take the Hadamard product (element-wise multiplication) of the input matrix with this layer, so W is multiplied element-wise with the input. How do I get a trainable layer for W in this case?

By the way, thank you so much for the quick reply. However, the Hadamard product I want to compute is between two matrices: one is the input, let's call it X, and my X has shape NxP. I want the kernel in the Hadamard layer to be the same size as X, so the kernel should have a size of NxP too, and the element-wise multiplication of the two matrices is done in the call function.

But the current implementation gives the kernel a size of P only. Also, I tried changing the shape of the kernel in build as follows:

self.kernel = self.add_weight(name='kernel',
                                      shape=input_shape,
                                      initializer='uniform',
                                      trainable=True)

But it gives me the error below:

TypeError: Failed to convert object of type to Tensor. Contents: (None, 16). Consider casting elements to a supported type.

Here P is 16, and I will only get N at runtime; N corresponds to the number of training samples.

Thanks in advance for your help.

Recommended answer

Take the example from the Keras documentation on writing a custom layer, and in the call function simply define the output as x * self.kernel.

Here is my POC:

from keras import backend as K
from keras.engine.topology import Layer
from keras.models import Sequential
from keras.layers import Dense, Activation
import numpy as np
np.random.seed(7)

class Hadamard(Layer):

    def __init__(self, **kwargs):
        super(Hadamard, self).__init__(**kwargs)

    def build(self, input_shape):
        # Create a trainable weight variable for this layer.
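        # Shape (1,) + input_shape[1:] gives one weight per feature position and
        # broadcasts it over the batch dimension, so the same N x P kernel is
        # applied element-wise to every sample.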
        self.kernel = self.add_weight(name='kernel', 
                                      shape=(1,) + input_shape[1:],
                                      initializer='uniform',
                                      trainable=True)
        super(Hadamard, self).build(input_shape)  # Be sure to call this somewhere!

    def call(self, x):
        print(x.shape, self.kernel.shape)
        return x * self.kernel

    def compute_output_shape(self, input_shape):
        print(input_shape)
        return input_shape

N = 10
P = 64

model = Sequential()
model.add(Dense(128, input_shape=(N, P), activation='relu'))
model.add(Dense(64))
model.add(Hadamard())
model.add(Activation('relu'))
model.add(Dense(32))
model.add(Dense(1))

model.compile(loss='mean_squared_error', optimizer='adam')

print(model.summary())

model.fit(np.ones((10, N, P)), np.ones((10, N, 1)))

print(model.predict(np.ones((20, N, P))))
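A quick note on the design (not in the original answer): the leading 1 in the kernel shape means the same N x P weight matrix is broadcast across the batch dimension, so every sample is multiplied by the same trainable weights. The standalone NumPy sketch below illustrates the same broadcasting rule; the array names are purely illustrative:

import numpy as np

x = np.ones((3, 10, 64))        # a batch of 3 samples, each of shape N x P = 10 x 64
w = np.random.rand(1, 10, 64)   # kernel with a leading broadcast dimension, like the layer's
out = x * w                     # element-wise product; the same w is applied to every sample
print(out.shape)                # (3, 10, 64)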

If you need to use it as the first layer, you should include the input_shape parameter:

N = 10
P = 64

model = Sequential()
model.add(Hadamard(input_shape=(N, P)))

model.compile(loss='mean_squared_error', optimizer='adam')

print(model.summary())

The result is:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
hadamard_1 (Hadamard)       (None, 10, 64)            640       
=================================================================
Total params: 640
Trainable params: 640
Non-trainable params: 0
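
As a sanity check (not part of the original answer, but continuing from the model above), the 640 parameters are exactly the kernel of shape (1, 10, 64), i.e. 10 * 64 weights shared by all samples, which you can confirm with get_weights:

kernel = model.layers[0].get_weights()[0]   # the Hadamard layer's only weight
print(kernel.shape)                         # (1, 10, 64) -> 640 trainable parameters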
