Tensorflow op in Keras model


Question

I'm trying to use a tensorflow op inside a Keras model. I previously tried to wrap it with a Lambda layer, but I believe this disables that layer's backpropagation.

More specifically, I'm trying to use the layers from here in a Keras model, without porting them to Keras layers (I hope to deploy to tensorflow later on). I can compile these layers into a shared library and load them into python. This gives me tensorflow ops, and I don't know how to combine them into a Keras model.

A simple example of a Keras MNIST model where, for example, one Conv2D layer is replaced by a tf.nn.conv2d op would be exactly what I'm looking for.
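A minimal sketch of that idea (the `RawConv2D` name is invented here for illustration): a custom Keras layer whose `call` invokes the raw `tf.nn.conv2d` op directly. Because `tf.nn.conv2d` has a gradient registered with tensorflow, backpropagation through the layer still works.

```python
import tensorflow as tf

class RawConv2D(tf.keras.layers.Layer):
    """Hypothetical layer that calls the raw tf.nn.conv2d op."""

    def __init__(self, filters, kernel_size, **kwargs):
        super().__init__(**kwargs)
        self.filters = filters
        self.kernel_size = kernel_size

    def build(self, input_shape):
        in_channels = int(input_shape[-1])
        # Trainable kernel managed by Keras, consumed by the raw op.
        self.kernel = self.add_weight(
            name="kernel",
            shape=(self.kernel_size, self.kernel_size, in_channels, self.filters),
            initializer="glorot_uniform",
            trainable=True,
        )

    def call(self, inputs):
        # Raw tensorflow op instead of tf.keras.layers.Conv2D;
        # its registered gradient makes backprop work as usual.
        return tf.nn.relu(
            tf.nn.conv2d(inputs, self.kernel, strides=[1, 1, 1, 1], padding="SAME")
        )

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    RawConv2D(8, 3),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

The same wrapping would apply to an op loaded from a shared library: call it inside `call`, and keep any trainable variables in `build` so Keras tracks them.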

I've seen this tutorial, but it appears to do the opposite of what I am looking for: it seems to insert Keras layers into a tensorflow graph. I'm looking to do the exact opposite.

Best regards, Hans

Answer

Roughly two weeks have passed, and it seems I am able to answer my own question now.

It seems tensorflow can look up gradients if you register them using this decorator. As of writing, this functionality is not (yet) available in C++, which is what I was looking for. A workaround is to define a normal op in C++ and wrap it in a python method using the mentioned decorator. If these functions and their corresponding gradients are registered with tensorflow, backpropagation happens 'automagically'.
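The Python-side registration can be sketched with the `tf.custom_gradient` decorator (used here for illustration; for a C++ op loaded via `tf.load_op_library` you would register the gradient against the op's name instead). The forward pass below is just `tf.identity` standing in for a custom op:

```python
import tensorflow as tf

@tf.custom_gradient
def my_op(x):
    # Forward pass: a stand-in for a custom op; in the real case
    # this would call a kernel loaded with tf.load_op_library.
    y = tf.identity(x)

    def grad(dy):
        # Hand-written gradient, registered with tensorflow via the
        # decorator, so backprop happens 'automagically'. Here the
        # incoming gradient is clipped to [-1, 1] as a demonstration.
        return tf.clip_by_value(dy, -1.0, 1.0)

    return y, grad

x = tf.constant([2.0, -3.0])
with tf.GradientTape() as tape:
    tape.watch(x)
    y = 5.0 * my_op(x)
g = tape.gradient(y, x)  # upstream gradient 5.0, clipped to 1.0
```

Once the gradient is registered this way, the op can sit inside a Keras layer and training works without further changes.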
