Is it normal to use batch normalization in RNN & LSTM?


Question


I know that in regular neural networks people use batch normalization before the activation, and that it reduces the reliance on good weight initialization. I wonder whether it does the same for an RNN/LSTM when I use it. Does anyone have experience with this?

Answer


No, you cannot use Batch Normalization directly on a recurrent neural network: the statistics are computed per batch, which does not account for the recurrent part of the network. Weights are shared across timesteps in an RNN, and the activation response for each "recurrent loop" may have completely different statistical properties.
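To make the issue concrete, here is a minimal PyTorch sketch (the module names and sizes are illustrative assumptions, not from the original post): reusing a single BatchNorm layer at every timestep pools its running statistics over activations whose distributions differ from step to step.

```python
import torch
import torch.nn as nn

# Naive idea: reuse one BatchNorm layer across all timesteps of an RNN.
# Its running mean/variance are pooled over every timestep, even though
# the hidden-state distribution of an RNN typically changes per step.
rnn_cell = nn.RNNCell(input_size=16, hidden_size=32)
bn = nn.BatchNorm1d(32)               # statistics computed over the batch

x = torch.randn(8, 10, 16)            # (batch, time, features)
h = torch.zeros(8, 32)
for t in range(x.size(1)):
    h = rnn_cell(x[:, t, :], h)
    h = bn(h)                         # same statistics for every "recurrent loop"
```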


Other techniques similar to Batch Normalization that take these limitations into account have been developed, for example Layer Normalization. There are also reparametrizations of the LSTM layer that allow Batch Normalization to be used, as described in Recurrent Batch Normalization by Cooijmans et al., 2016.
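As an illustration, a common simplified pattern in PyTorch (a sketch under my own assumptions, not the exact per-gate formulation from either paper) is to apply nn.LayerNorm to the LSTM outputs; its statistics are computed per sample over the feature dimension, so they are independent of the batch and of the timestep.

```python
import torch
import torch.nn as nn

class LayerNormLSTM(nn.Module):
    """LSTM followed by Layer Normalization on the hidden states.

    Statistics are computed per sample over the feature dimension,
    so they do not depend on the batch or on the timestep.
    """
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, x):
        out, _ = self.lstm(x)          # out: (batch, time, hidden)
        return self.norm(out)          # normalized over the feature dimension

model = LayerNormLSTM(input_size=16, hidden_size=32)
y = model(torch.randn(8, 10, 16))      # -> (8, 10, 32)
```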
