Share weights between two dense layers in Keras


Problem description

I have the following code. What I want to do is share the same weights between the two dense layers.

The equations for the op1 and op2 layers are:

op1 = w1*y1 + w2*y2 + w3*y3 + w4*y4 + w5*y5 + b1

op2 = w1*z1 + w2*z2 + w3*z3 + w4*z4 + w5*z5 + b1

Here the weights w1 through w5 are shared between the inputs of op1 and op2, which are (y1 to y5) and (z1 to z5) respectively.

from keras.layers import Input, Dense, concatenate
from keras.models import Model

ip_shape1 = Input(shape=(5,))
ip_shape2 = Input(shape=(5,))

# Two separate Dense layers: each call creates its own kernel and bias,
# so the weights are NOT shared between op1 and op2.
op1 = Dense(1, activation="sigmoid", kernel_initializer="ones")(ip_shape1)
op2 = Dense(1, activation="sigmoid", kernel_initializer="ones")(ip_shape2)

merge_layer = concatenate([op1, op2])
predictions = Dense(1, activation="sigmoid")(merge_layer)

model = Model(inputs=[ip_shape1, ip_shape2], outputs=predictions)
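
A quick way to see that nothing is shared in the code above: each of the three Dense layers owns its own kernel and bias, so the model holds six trainable weight tensors (a minimal check, assuming the standalone keras package used above):

# 6 tensors: three kernels + three biases, one pair per Dense layer.
print(len(model.trainable_weights))  # 6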

Thanks.

Answer

This uses the same layer for both sides (weights and bias are shared):

from keras.layers import Input, Dense, Concatenate
from keras.models import Model

ip_shape1 = Input(shape=(5,))
ip_shape2 = Input(shape=(5,))

# Create the Dense layer once and call it twice: both calls go through the
# same layer instance, so the kernel (w1..w5) and bias (b1) are shared.
dense = Dense(1, activation="sigmoid", kernel_initializer="ones")

op1 = dense(ip_shape1)
op2 = dense(ip_shape2)

merge_layer = Concatenate()([op1, op2])
predictions = Dense(1, activation="sigmoid")(merge_layer)

model = Model(inputs=[ip_shape1, ip_shape2], outputs=predictions)
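
Because dense is called on both inputs, it contributes only one kernel and one bias to the model. A minimal check of the sharing (same assumptions as above):

# Shared layer: 1 kernel + 1 bias, plus the final Dense layer's 2 tensors.
print(len(model.trainable_weights))  # 4

# Both op1 and op2 read from this single (5, 1) kernel and (1,) bias.
print([w.shape for w in dense.get_weights()])  # [(5, 1), (1,)]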
