How to share convolution kernels between layers in keras?


Problem description

Suppose I want to compare two images with a deep convolutional NN. How can I implement two different pathways with the same kernels in Keras?

Like this:

I need convolutional layers 1, 2 and 3 to use and train the same kernels.

Is it possible?

I was also thinking of concatenating the images like below,

but the question is about how to implement the topology in the first picture.

Recommended answer

You can use the same layers twice in a model, which creates shared weights. You can also use the same model twice, making it a submodel of a bigger one:

#imports for the Keras functional API
from keras.layers import Input, Concatenate, Flatten, Dense
from keras.models import Model

#have a previously prepared model
convModel = some model previously prepared

#define two different inputs
input1 = Input((imageX, imageY, channels))
input2 = Input((imageX, imageY, channels))

#use the same model to get two different outputs (the kernels are shared and trained together):
out1 = convModel(input1)
out2 = convModel(input2)

#concatenate the outputs and add the final part of your model:
out = Concatenate()([out1, out2])
out = Flatten()(out)
out = Dense(...)(out)
out = Dense(...)(out)

#create the model taking 2 inputs with one output
model = Model([input1, input2], out)
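
If you prefer to share individual layers rather than a whole submodel, the same idea applies: create each Conv2D layer once and call it on both inputs. A minimal sketch, assuming 64x64 RGB inputs and hypothetical filter counts and sizes:

#sketch of sharing individual Conv2D layers between two pathways (hypothetical sizes)
from keras.layers import Input, Conv2D, Concatenate, Flatten, Dense
from keras.models import Model

#create the shared layers once; calling them on both inputs reuses the same kernels
conv1 = Conv2D(32, (3, 3), activation='relu')
conv2 = Conv2D(64, (3, 3), activation='relu')
conv3 = Conv2D(64, (3, 3), activation='relu')

input1 = Input((64, 64, 3))
input2 = Input((64, 64, 3))

#pathway 1
x1 = conv1(input1)
x1 = conv2(x1)
x1 = conv3(x1)

#pathway 2 - same layer objects, so the same trained kernels
x2 = conv1(input2)
x2 = conv2(x2)
x2 = conv3(x2)

#join the two pathways and classify
out = Concatenate()([x1, x2])
out = Flatten()(out)
out = Dense(1, activation='sigmoid')(out)

model = Model([input1, input2], out)
model.summary()  #each shared Conv2D appears once, connected to both inputs

Because conv1, conv2 and conv3 are single layer objects called on both inputs, there is only one set of kernels for each of them, and training updates those kernels from both pathways at once.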
