Tensorflow 2.0: How to share parameters among convolutional layers?

Question

I am trying to re-implement Multi-View CNN (MVCNN) in TensorFlow 2.0. However, from what I can see, Keras layers do not have the reuse=True|False option that tf.layers had. Is there any way to define layers that share parameters using the new API, or do I need to build my model in the TF v1 fashion?

Thanks a lot!

Answer

To share the parameters of a model, you just have to use the same model. This is the new paradigm introduced in TensorFlow 2.0: in TF 1.x we used a graph-oriented approach, where we needed to re-use the same graph in order to share variables, but now we can simply re-use the same tf.keras.Model object with different inputs.

A tf.keras.Model is the object that carries its own variables.
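
For example, you can verify that calling the same model on two different inputs does not create any new variables. Below is a minimal sketch; the tiny convolutional network is an arbitrary illustration, not part of the original answer:

import tensorflow as tf

# An arbitrary small convolutional model, just for illustration
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(32, 32, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
])

x1 = tf.random.normal((1, 32, 32, 3))
x2 = tf.random.normal((1, 32, 32, 3))

_ = model(x1)
n_vars = len(model.trainable_variables)

# The second call re-uses the same weights: no new variables appear
_ = model(x2)
assert len(model.trainable_variables) == n_vars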

Using a Keras model and tf.GradientTape, you can easily train a model that shares variables, as shown in the example below.


import tensorflow as tf

# This is your model definition
model = tf.keras.Sequential(...)

# input_1 and input_2 are two different inputs to the same model

with tf.GradientTape() as tape:
    a = model(input_1)
    b = model(input_2)
    # You can then compute the loss from the two outputs
    loss = a + b

# Use the tape to compute the gradients of the loss
# w.r.t. the model's trainable variables
grads = tape.gradient(loss, model.trainable_variables)

# opt is an optimizer object, e.g. tf.optimizers.Adam(),
# used to apply the update rule, following the gradient direction
opt.apply_gradients(zip(grads, model.trainable_variables))
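
In the MVCNN setting, the same idea applies per view: call one shared convolutional backbone on every view, then pool the per-view features. Below is a minimal sketch using the Keras functional API; the architecture, input shapes, and view/class counts are illustrative assumptions, not the paper's exact configuration:

import tensorflow as tf

NUM_VIEWS = 12    # illustrative assumption
NUM_CLASSES = 40  # illustrative assumption

# One shared backbone: a single set of conv weights re-used for every view
backbone = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
])

views = [tf.keras.Input(shape=(64, 64, 3)) for _ in range(NUM_VIEWS)]

# Calling the same backbone object on each view shares its parameters
features = [backbone(v) for v in views]

# View pooling: element-wise max across views, as in MVCNN
pooled = tf.keras.layers.Maximum()(features)

logits = tf.keras.layers.Dense(NUM_CLASSES)(pooled)
mvcnn = tf.keras.Model(inputs=views, outputs=logits)

Since every view flows through the same backbone object, training mvcnn updates a single set of convolutional parameters.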
