How to wrap a tensorflow object as Keras layer?
Question
I would like to implement Hierarchical Multiscale LSTM as a Keras layer.
It was published here and implemented in tensorflow here.
My understanding is that there's a way to wrap such a TensorFlow object in Keras as a layer. I'm not sure how complicated it would be, but I think it's feasible. Can you help me with how to do it?
Answer
This is usually done by implementing a custom Layer. To be more specific, you should inherit from keras.engine.topology.Layer and provide a custom implementation for the following methods (and place the TensorFlow code within them):
build(input_shape): this is where you will define your weights. This method must set self.built = True, which can be done by calling super([Layer], self).build().

call(x): this is where the layer's logic lives. Unless you want your layer to support masking, you only have to care about the first argument passed to call: the input tensor.

compute_output_shape(input_shape): in case your layer modifies the shape of its input, you should specify here the shape transformation logic. This allows Keras to do automatic shape inference.
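A minimal sketch of these three methods, using a simple dense transform as a stand-in for the real TensorFlow logic. This uses the modern tensorflow.keras import path; in the older Keras versions this answer targets, the base class lived at keras.engine.topology.Layer. The class and variable names here are illustrative, not part of any library:

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer

class SimpleDense(Layer):
    """Toy custom layer showing build/call/compute_output_shape."""

    def __init__(self, units, **kwargs):
        super(SimpleDense, self).__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        # Define the layer's weights; arbitrary TensorFlow code can go here.
        self.kernel = self.add_weight(
            name='kernel',
            shape=(int(input_shape[-1]), self.units),
            initializer='uniform',
            trainable=True)
        # The parent's build() sets self.built = True.
        super(SimpleDense, self).build(input_shape)

    def call(self, x):
        # The layer's forward logic: any TensorFlow ops can live here.
        return tf.matmul(x, self.kernel)

    def compute_output_shape(self, input_shape):
        # The layer changes the last dimension from input_dim to units.
        return (input_shape[0], self.units)
```

Once defined, the layer is used like any built-in one, e.g. SimpleDense(8)(inputs) inside a model.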
Since you're trying to implement a recurrent layer, it would also be convenient to inherit directly from keras.legacy.layers.recurrent. In this case, you probably do not need to redefine compute_output_shape(input_shape). If your layer needs additional arguments, you can pass them to the __init__ method of your custom layer.
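In current Keras versions the legacy recurrent base class is gone; the equivalent idiom is to write a custom cell and wrap it in keras.layers.RNN, which handles the time loop for you. A hedged sketch, where the toy cell body is a placeholder for the actual HM-LSTM TensorFlow code (the names MinimalCell etc. are assumptions, not library API):

```python
import tensorflow as tf
from tensorflow.keras.layers import Layer, RNN

class MinimalCell(Layer):
    """Toy RNN cell; the real per-timestep TensorFlow logic
    (e.g. the HM-LSTM update) would replace the body of call()."""

    def __init__(self, units, **kwargs):
        super(MinimalCell, self).__init__(**kwargs)
        self.units = units
        self.state_size = units  # required attribute for RNN cells

    def build(self, input_shape):
        self.kernel = self.add_weight(
            name='kernel',
            shape=(int(input_shape[-1]), self.units),
            initializer='uniform', trainable=True)
        self.recurrent_kernel = self.add_weight(
            name='recurrent_kernel',
            shape=(self.units, self.units),
            initializer='uniform', trainable=True)
        super(MinimalCell, self).build(input_shape)

    def call(self, inputs, states):
        # One timestep: combine the input with the previous state.
        prev = states[0]
        h = tf.tanh(tf.matmul(inputs, self.kernel)
                    + tf.matmul(prev, self.recurrent_kernel))
        # Return (output, new_states).
        return h, [h]
```

Extra constructor arguments (here, units) are simply passed to __init__, as the answer suggests; RNN(MinimalCell(8)) then behaves like any recurrent layer.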