Backward propagation in Keras?

Question

Can anyone tell me how backpropagation is done in Keras? I read that it is really easy in Torch and complex in Caffe, but I can't find anything about doing it with Keras. I am implementing my own layers in Keras (I'm a complete beginner) and would like to know how to do the backward propagation.

Thanks in advance.

Answer

You simply don't. (Late edit: except when you are creating custom training loops, only for advanced uses)

Keras does backpropagation automatically. There's absolutely nothing you need to do for that except for training the model with one of the fit methods.
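
As a minimal sketch (assuming a TensorFlow-backed Keras install; the toy data and layer sizes are made up for illustration), this is all the code a training run needs — backpropagation happens inside fit(), with no manual gradient code:

```python
import numpy as np
from tensorflow import keras

# Toy regression data, purely illustrative.
x = np.random.rand(32, 4).astype("float32")
y = np.random.rand(32, 1).astype("float32")

# A small model built with the functional API.
inputs = keras.Input(shape=(4,))
hidden = keras.layers.Dense(8, activation="relu")(inputs)
outputs = keras.layers.Dense(1)(hidden)
model = keras.Model(inputs, outputs)

# compile() picks the loss and optimizer; fit() then runs the forward
# pass, backpropagation, and weight updates automatically each epoch.
model.compile(optimizer="adam", loss="mse")
history = model.fit(x, y, epochs=2, verbose=0)
```

Note that nothing in this script mentions gradients: the framework derives them from the operations used to build the model.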

You just need to take care of a few things:

  • The variables you want updated by backpropagation (that is, the weights) must be defined in the custom layer with the self.add_weight() method inside the build method. See writing your own keras layers.
  • All calculations you perform must use basic operators such as +, -, *, / or backend functions. Backend functions from TensorFlow, Theano, or CNTK are also supported.

This is all you need to have the automatic backpropagation working properly.
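
The two requirements above can be sketched in a hypothetical custom layer (a plain dense layer, assuming a TensorFlow-backed Keras install — the layer name and sizes are invented for illustration):

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

class MyDense(keras.layers.Layer):
    """Illustrative dense layer: weights registered in build(), math in call()."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        # Trainable variables MUST be created with self.add_weight() here,
        # so Keras tracks them and backpropagation updates them.
        self.kernel = self.add_weight(
            name="kernel", shape=(int(input_shape[-1]), self.units),
            initializer="glorot_uniform", trainable=True)
        self.bias = self.add_weight(
            name="bias", shape=(self.units,),
            initializer="zeros", trainable=True)
        super().build(input_shape)

    def call(self, inputs):
        # Only basic operators and backend functions, so gradients
        # flow through automatically.
        return tf.matmul(inputs, self.kernel) + self.bias

inputs = keras.Input(shape=(5,))
outputs = MyDense(3)(inputs)
model = keras.Model(inputs, outputs)
model.compile(optimizer="sgd", loss="mse")
out = model.predict(np.zeros((2, 5), dtype="float32"), verbose=0)
```

Because the kernel and bias go through add_weight() and call() uses only differentiable operations, fit() can train this layer with no backward method written by hand.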

If your layers don't have trainable weights, you don't need custom layers; create Lambda layers instead (only calculations, no trainable weights).
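
A short sketch of that case (the function inside the Lambda is an arbitrary example, assuming a TensorFlow-backed Keras install):

```python
import numpy as np
from tensorflow import keras

# A weight-free transformation: no build(), no add_weight() —
# a Lambda layer is enough, and gradients still flow through it.
inputs = keras.Input(shape=(4,))
outputs = keras.layers.Lambda(lambda x: x * 2.0 + 1.0)(inputs)
model = keras.Model(inputs, outputs)

out = model.predict(np.ones((1, 4), dtype="float32"), verbose=0)
```

The model has no trainable weights of its own, but it can still sit inside a larger network that trains normally.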
