Keras: the difference between LSTM dropout and LSTM recurrent dropout

Problem description

From the Keras documentation:

dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the inputs.

recurrent_dropout: Float between 0 and 1. Fraction of the units to drop for the linear transformation of the recurrent state.

Can anyone point to where on the image below each dropout happens?

Solution

I suggest taking a look at (the first part of) this paper. Regular dropout is applied to the inputs and/or the outputs, meaning the vertical arrows from x_t and to h_t. In your case, if you add it as an argument to your layer, it will mask the inputs; you can add a Dropout layer after your recurrent layer to mask the outputs as well. Recurrent dropout masks (or "drops") the connections between the recurrent units; those are the horizontal arrows in your picture.
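
For concreteness, here is a minimal sketch of the three knobs in Keras (the layer sizes and input shape are made up for illustration):

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(20, 8)),  # 20 timesteps, 8 features per step
    layers.LSTM(
        32,
        dropout=0.2,            # masks the inputs (vertical arrows from x_t)
        recurrent_dropout=0.2,  # masks the recurrent state (horizontal arrows)
    ),
    layers.Dropout(0.2),        # masks the LSTM's outputs (vertical arrows to h_t)
    layers.Dense(1),
])
model.summary()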

This picture is taken from the paper above: on the left, regular dropout on the inputs and outputs; on the right, regular dropout plus recurrent dropout. (Ignore the colour of the arrows here; in the paper they make the further point of keeping the same dropout masks at each timestep.)
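
If it helps to see where the two masks act, here is a rough NumPy sketch of a simplified recurrence (a plain tanh update rather than a real LSTM cell; shapes and the dropout rate are made up for illustration). As in the paper, each mask is sampled once and reused at every timestep:

import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, timesteps = 8, 32, 20
W = rng.normal(size=(n_hid, n_in))    # input-to-hidden weights
U = rng.normal(size=(n_hid, n_hid))   # hidden-to-hidden (recurrent) weights
xs = rng.normal(size=(timesteps, n_in))

rate = 0.2
# Sample each mask once and keep it fixed across all timesteps
mask_in = (rng.random(n_in) > rate) / (1.0 - rate)    # plays the role of dropout
mask_rec = (rng.random(n_hid) > rate) / (1.0 - rate)  # plays the role of recurrent_dropout

h = np.zeros(n_hid)
for x_t in xs:
    # regular dropout masks the input x_t; recurrent dropout masks h_{t-1}
    h = np.tanh(W @ (mask_in * x_t) + U @ (mask_rec * h))
print(h.shape)  # (32,)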
