How to implement an RBF activation function in Keras?
Question
I am creating a customized activation function, an RBF activation function in particular:
from keras import backend as K

l2_norm = lambda a, b: K.sqrt(K.sum(K.pow((a - b), 2), axis=0, keepdims=True))

def rbf2(x):
    X = ...  # here I need the inputs that I receive from the previous layer
    Y = ...  # here I need the weights that I should apply for this layer
    l2 = l2_norm(X, Y)
    res = K.exp(-1 * gamma * K.pow(l2, 2))  # gamma is an RBF hyperparameter, not defined here
    return res
The function rbf2 receives the previous layer as input:
#some keras layers
model.add(Dense(84, activation='tanh')) #layer1
model.add(Dense(10, activation = rbf2)) #layer2
What should I do to get the inputs from layer1 and the weights from layer2 to create the customized activation function?
What I am actually trying to do is implement the output layer of the LeNet-5 neural network. The output layer of LeNet-5 is a bit special: instead of computing the dot product of the inputs and the weight vector, each neuron outputs the square of the Euclidean distance between its input vector and its weight vector.
For example, layer1 has 84 neurons and layer2 has 10 neurons. In the general case, to calculate the output of each of the 10 neurons of layer2, we take the dot product of the 84 neurons of layer1 with the 84 weights between layer1 and layer2, and then apply the softmax activation function over it.
But here, instead of computing a dot product, each neuron of layer2 outputs the square of the Euclidean distance between its input vector and its weight vector (I want to use this as my activation function).
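The contrast between the two computations can be sketched in NumPy; the vectors below are arbitrary illustrative values, not actual LeNet-5 weights:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])  # hypothetical input vector from layer1
w = np.array([0.5, 1.5, 2.5])  # hypothetical weight vector of one layer2 neuron

dot = np.dot(x, w)              # ordinary dense neuron: 0.5 + 3.0 + 7.5 = 11.0
sq_dist = np.sum((x - w) ** 2)  # LeNet-5 output neuron: 0.25 + 0.25 + 0.25 = 0.75
```

A dense layer would feed `dot` into its activation, whereas the LeNet-5 output layer uses `sq_dist` directly.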
Any help on creating an RBF activation function (computing the Euclidean distance between the inputs the layer receives and its weights) and using it in a layer would also be appreciated.
Answer
You can simply create your own custom layer. Example usage:
model = Sequential()
model.add(Dense(20, input_shape=(100,)))
model.add(RBFLayer(10, 0.5))
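The RBFLayer used above is not defined in this excerpt. A minimal sketch of such a layer, assuming one trainable center vector per output unit and a fixed gamma passed to the constructor (matching the `RBFLayer(units, gamma)` call in the usage snippet), might look like:

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer

class RBFLayer(Layer):
    """Sketch of an RBF layer: output_j = exp(-gamma * ||x - c_j||^2)."""

    def __init__(self, units, gamma, **kwargs):
        super().__init__(**kwargs)
        self.units = units
        self.gamma = gamma

    def build(self, input_shape):
        # one trainable center per output unit: shape (input_dim, units)
        self.centers = self.add_weight(
            name='centers',
            shape=(int(input_shape[-1]), self.units),
            initializer='uniform',
            trainable=True)
        super().build(input_shape)

    def call(self, inputs):
        # squared Euclidean distance between each input row and each center
        diff = tf.expand_dims(inputs, -1) - self.centers   # (batch, dim, units)
        sq_dist = tf.reduce_sum(tf.square(diff), axis=1)   # (batch, units)
        return tf.exp(-self.gamma * sq_dist)

    def compute_output_shape(self, input_shape):
        return (input_shape[0], self.units)
```

Because the centers are ordinary trainable weights, the layer learns them by backpropagation just like a Dense layer's kernel; the exponential keeps every output in (0, 1], with 1 reached when the input coincides with a center.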