Constraint on the sum of parameters in Keras Layer
Question
I want to add a custom constraint on the parameters of a layer. I wrote a custom activation layer with two trainable parameters a and b such that:

    activation_fct = a*fct() + b*fct()

I need the sum of the parameters (a + b) to equal 1, but I don't know how to write such a constraint. Can you give me some advice?

Thanks in advance.
Answer
You can have a single weight instead of two, and use this custom constraint:
    import keras
    import keras.backend as K

    class Between_0_1(keras.constraints.Constraint):
        def __call__(self, w):
            return K.clip(w, 0, 1)
Then when building the weights, build only a and use the constraint.
    def build(self, input_shape):
        self.a = self.add_weight(name='weight_a',
                                 shape=(1,),
                                 initializer='uniform',
                                 constraint=Between_0_1(),
                                 trainable=True)

        # if you want to start at 0.5
        K.set_value(self.a, [0.5])

        self.built = True
In call, use b = 1 - a:
    def call(self, inputs, **kwargs):
        # do stuff
        ....
        return (self.a * something) + ((1 - self.a) * another_thing)
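To see why this parameterization keeps the sum at exactly 1, here is a small framework-free sketch (plain NumPy standing in for the Keras tensors; clip_0_1 mimics the constraint above):

```python
import numpy as np

def clip_0_1(w):
    # mimics K.clip(w, 0, 1) from the Between_0_1 constraint
    return np.clip(w, 0.0, 1.0)

# whatever value the optimizer pushes the raw weight to,
# the constraint clips it back into [0, 1], and b = 1 - a
# then guarantees a + b == 1
for raw_a in [-0.3, 0.25, 0.8, 1.7]:
    a = clip_0_1(raw_a)
    b = 1.0 - a
    assert 0.0 <= a <= 1.0
    assert abs((a + b) - 1.0) < 1e-12
```

Note that Keras applies the constraint after each gradient update, so a stays in [0, 1] throughout training and the sum constraint is enforced by construction rather than by penalizing the loss.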
You can also try @MatusDubrava's softmax approach, but in this case your weights need to have shape (2,), and no constraint is needed:
    def build(self, input_shape):
        self.w = self.add_weight(name='weights',
                                 shape=(2,),
                                 initializer='zeros',
                                 trainable=True)
        self.built = True
    def call(self, inputs, **kwargs):
        w = K.softmax(self.w)
        # do stuff
        ....
        return (w[0] * something) + (w[1] * another_thing)
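The softmax variant works because softmax maps any unconstrained weight vector to positive values that sum to 1. A quick NumPy sketch of the same computation (this is an illustration, not the Keras backend code):

```python
import numpy as np

def softmax(w):
    # numerically stable softmax, equivalent to what K.softmax computes
    e = np.exp(w - np.max(w))
    return e / e.sum()

# the 'zeros' initializer starts the mixture at a = b = 0.5
w = np.array([0.0, 0.0])
a, b = softmax(w)
print(a, b)  # 0.5 0.5

# even for arbitrary raw weights, the outputs always sum to 1
a, b = softmax(np.array([2.0, -1.0]))
assert abs((a + b) - 1.0) < 1e-12
```

The trade-off versus the clipping approach is that softmax keeps both weights strictly positive, so neither term can be switched off entirely.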