Keras: the difference between LSTM dropout and LSTM recurrent dropout

Question

From the Keras documentation:

dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs.

recurrent_dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state.

Can anyone point to where on the image below each dropout happens?
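
For context, a minimal sketch (assuming TensorFlow's bundled Keras; the sizes and rates are arbitrary) of where these two arguments are passed:

```python
import tensorflow as tf

# Both arguments are set on the recurrent layer itself.
lstm = tf.keras.layers.LSTM(
    units=32,
    dropout=0.2,            # fraction of input units to drop
    recurrent_dropout=0.2,  # fraction of recurrent-state units to drop
)

# The masks are only applied in training mode, e.g. during model.fit()
# or when the layer is called with training=True.
```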

Answer

I suggest taking a look at (the first part of) Gal and Ghahramani's paper, "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks" (arXiv:1512.05287). Regular dropout is applied on the inputs and/or the outputs, meaning the vertical arrows from x_t and to h_t. In your case, if you add it as an argument to your layer, it will mask the inputs; you can add a Dropout layer after your recurrent layer to mask the outputs as well. Recurrent dropout masks (or "drops") the connections between the recurrent units; that would be the horizontal arrows in your picture.
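
In Keras terms, that maps onto the layer arguments as in the sketch below (a minimal sketch assuming tf.keras; the shapes and rates are arbitrary):

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 8)),  # (timesteps, features), arbitrary values
    tf.keras.layers.LSTM(
        32,
        dropout=0.2,            # masks the inputs: the vertical arrows from x_t
        recurrent_dropout=0.2,  # masks the recurrent state: the horizontal arrows
        return_sequences=True,
    ),
    tf.keras.layers.Dropout(0.2),  # regular dropout on the outputs (towards h_t)
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```

As a practical aside, in TensorFlow's Keras a non-zero recurrent_dropout disables the fused cuDNN LSTM kernel, so training falls back to the slower generic implementation.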

This picture is taken from the paper above. On the left, regular dropout on inputs and outputs. On the right, regular dropout PLUS recurrent dropout.

(Ignore the colour of the arrows in this case; in the paper they are making a further point of keeping the same dropout masks at each timestep)
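
To make that last point concrete, here is a small illustrative NumPy sketch (not Keras internals) of the "same mask at every timestep" idea: one mask is sampled per sequence and reused at each step, rather than resampled per timestep:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
rate = 0.2                       # dropout rate
timesteps, features = 5, 4
x = rng.normal(size=(timesteps, features))

# Sample ONE mask for the whole sequence (with inverted-dropout scaling),
# then broadcast it across all timesteps.
mask = (rng.random(features) >= rate) / (1.0 - rate)
x_dropped = x * mask             # the same units are dropped at every timestep
```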
