Keras: the difference between LSTM dropout and LSTM recurrent dropout
Question
From the Keras documentation:
dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs.
recurrent_dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state.
Can anyone point to where on the image below each dropout happens?
Answer
I suggest taking a look at (the first part of) this paper. Regular dropout is applied on the inputs and/or the outputs, meaning the vertical arrows from x_t and to h_t. In your case, if you add it as an argument to your layer, it will mask the inputs; you can add a Dropout layer after your recurrent layer to mask the outputs as well. Recurrent dropout masks (or "drops") the connections between the recurrent units; that would be the horizontal arrows in your picture.
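To make the three placements concrete, here is a minimal sketch in Keras (the layer sizes, rates, and input shape are arbitrary choices for illustration, not taken from the question):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(10, 8)),  # 10 timesteps, 8 features per step
    # dropout=0.2 masks the inputs x_t (the vertical arrows into the cell);
    # recurrent_dropout=0.3 masks the recurrent h_{t-1} -> h_t connections
    # (the horizontal arrows between timesteps).
    layers.LSTM(32, dropout=0.2, recurrent_dropout=0.3),
    # A separate Dropout layer masks the LSTM's outputs.
    layers.Dropout(0.2),
    layers.Dense(1),
])

# training=True activates the dropout masks on this forward pass.
out = model(np.zeros((4, 10, 8), dtype="float32"), training=True)
print(out.shape)  # (4, 1)
```

Note that setting a non-zero `recurrent_dropout` prevents Keras from using the fused cuDNN LSTM kernel, so training falls back to a slower generic implementation on GPU.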
This picture is taken from the paper above. On the left, regular dropout on inputs and outputs. On the right, regular dropout PLUS recurrent dropout:
(Ignore the colour of the arrows in this case; in the paper they are making a further point of keeping the same dropout masks at each timestep)