What is the difference between backpropagation and reverse-mode autodiff?


Problem description

Going through this book, I am familiar with the following:

For each training instance, the backpropagation algorithm first makes a prediction (forward pass), measures the error, then goes through each layer in reverse to measure the error contribution from each connection (reverse pass), and finally slightly tweaks the connection weights to reduce the error.
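
To make those steps concrete, here is a minimal sketch of one such training step in TensorFlow using tf.GradientTape. The tiny model, data, and learning rate are made up for illustration; they are not from the book:

    import tensorflow as tf

    # Toy data and a one-layer linear model (made-up values, illustration only).
    x = tf.constant([[1.0, 2.0]])          # one training instance
    y_true = tf.constant([[1.0]])
    w = tf.Variable([[0.1], [0.2]])
    b = tf.Variable([0.0])

    with tf.GradientTape() as tape:
        y_pred = tf.matmul(x, w) + b                    # forward pass: make a prediction
        loss = tf.reduce_mean((y_pred - y_true) ** 2)   # measure the error

    # Reverse pass: measure each parameter's contribution to the error.
    grads = tape.gradient(loss, [w, b])

    # Slightly tweak the weights to reduce the error (one gradient descent step).
    learning_rate = 0.01
    w.assign_sub(learning_rate * grads[0])
    b.assign_sub(learning_rate * grads[1])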

However, I am not sure how this differs from TensorFlow's reverse-mode autodiff implementation.

As far as I know, reverse-mode autodiff first goes through the graph in the forward direction and then, in a second pass, computes all partial derivatives of the outputs with respect to the inputs. This is very similar to the backpropagation algorithm.
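
That two-pass structure can be shown without any framework. Below is a hand-rolled sketch for f(x1, x2) = x1 * x2 + sin(x1); the variable names and the example function are my own, chosen for illustration. The forward pass evaluates the graph and records intermediates; the reverse pass applies the chain rule from the output back to every input:

    import math

    # Forward pass: evaluate f(x1, x2) = x1 * x2 + sin(x1), recording intermediates.
    x1, x2 = 2.0, 3.0
    v1 = x1 * x2          # intermediate node: product
    v2 = math.sin(x1)     # intermediate node: sine
    f = v1 + v2           # output node

    # Reverse pass: start from df/df = 1 and apply the chain rule backwards.
    df_df = 1.0
    df_dv1 = df_df * 1.0   # f = v1 + v2  =>  df/dv1 = 1
    df_dv2 = df_df * 1.0   # f = v1 + v2  =>  df/dv2 = 1
    df_dx1 = df_dv1 * x2 + df_dv2 * math.cos(x1)  # x1 feeds both v1 and v2
    df_dx2 = df_dv1 * x1

    print(df_dx1, df_dx2)  # all partials of the output w.r.t. the inputs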

How does backpropagation differ from reverse-mode autodiff?

Recommended answer

Thanks to David Parks for his answer, with its valid contribution and useful links; however, I have since found an answer to this question from the author of the book himself, which may be more concise:

Backpropagation refers to the whole process of training an artificial neural network using multiple backpropagation steps, each of which computes gradients and uses them to perform a Gradient Descent step. In contrast, reverse-mode autodiff is simply a technique used to compute gradients efficiently, and it happens to be used by backpropagation.
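
That distinction can be made visible in code. In the sketch below (the model, data, and hyperparameters are placeholders of my own, not from the answer), reverse-mode autodiff is only the tape.gradient call; backpropagation is the whole training loop built around it:

    import tensorflow as tf

    # Placeholder model and data, for illustration only.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)
    x = tf.random.normal((8, 2))
    y = tf.random.normal((8, 1))

    # Backpropagation: the whole training process, repeated over many steps.
    for step in range(100):
        with tf.GradientTape() as tape:
            loss = tf.reduce_mean((model(x) - y) ** 2)
        # Reverse-mode autodiff: just this efficient gradient computation.
        grads = tape.gradient(loss, model.trainable_variables)
        # Gradient Descent: use the gradients to tweak the weights.
        optimizer.apply_gradients(zip(grads, model.trainable_variables))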

