Is Leaky ReLU after Batch Normalization (BN) useful?
Question
In my CNN I am using Leaky ReLU after a BN layer. Leaky ReLU addresses the dying-ReLU problem by applying f(y) = ay for negative values. BN produces zero mean and unit variance. So does BN remove the negative part, i.e., does it rescale all values into a 0-to-1 range? The choice of Leaky ReLU depends on this: if BN removed the negative part, then using Leaky ReLU would be the same as using plain ReLU. I am using Keras.
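The difference the question hinges on can be sketched in plain NumPy (the slope `alpha=0.3` below is an assumption matching the default of Keras's `LeakyReLU` layer):

```python
import numpy as np

def relu(y):
    # Standard ReLU: negative inputs are zeroed, which can "kill" units
    # whose pre-activations stay negative.
    return np.maximum(0.0, y)

def leaky_relu(y, alpha=0.3):
    # Leaky ReLU: negative inputs are scaled by alpha (f(y) = a*y for y < 0)
    # instead of being zeroed, so gradients still flow for negative inputs.
    return np.where(y >= 0, y, alpha * y)

y = np.array([-2.0, -0.5, 0.0, 1.0])
print(relu(y))        # negatives become 0
print(leaky_relu(y))  # negatives are scaled by alpha, not zeroed
```

The two activations differ only on negative inputs, so the question reduces to whether BN's output can be negative.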
Answer
The BN layer tries to zero-mean its output by subtracting an expectation over inputs. So we can expect some of its output values to be negative.
So the LeakyReLU following the BN layer will still receive negative values.
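This can be checked with a minimal sketch of BN's normalization step — just the zero-mean/unit-variance transform, omitting the learned gamma/beta affine parameters for simplicity:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalization step of batch norm: per-feature zero mean, unit variance.
    # (The learned gamma/beta scale-and-shift is omitted for simplicity.)
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=(8, 4))  # strictly positive activations
out = batch_norm(x)
print((out < 0).any())  # True: zero-centering makes some outputs negative
```

Even though every input here is positive, subtracting the batch mean pushes roughly half of the outputs below zero, so the Leaky ReLU's negative slope still matters after BN.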