In tf.nn.dropout what is the effect of keep_prob argument?


Problem description

I am learning the tf.nn.dropout command. The documentation says that, with probability keep_prob, it outputs the input element scaled up by 1 / keep_prob, and otherwise outputs 0. The scaling is so that the expected sum is unchanged. Can someone please explain why we scale by 1 / keep_prob? And if I set its value to 0.1, does that mean I am keeping only 10 percent of the nodes?

Recommended answer

The scaling factor is set to 1 / keep_prob because dropout should be disabled at test or evaluation time. See the high-level API in TensorFlow's layers module: tf.layers.dropout, which takes a training flag for exactly this purpose.
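A minimal sketch of this train/test behaviour, assuming TensorFlow 1.x (note that tf.layers.dropout takes rate, the drop probability, so rate = 1 - keep_prob):

    import tensorflow as tf  # assumes TensorFlow 1.x APIs

    x = tf.ones([10])                  # 10 units, each with activation 1.0
    training = tf.placeholder(tf.bool)
    # rate = 1 - keep_prob; rate=0.9 corresponds to keep_prob=0.1
    y = tf.layers.dropout(x, rate=0.9, training=training)

    with tf.Session() as sess:
        print(sess.run(y, feed_dict={training: True}))   # some units zeroed, survivors scaled by 10
        print(sess.run(y, feed_dict={training: False}))  # identity: all ones, no scaling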

During testing or evaluation, the activation of each unit is left unscaled. The scaling factor of 1 / keep_prob during training makes sure that the expected activation (keep_prob * (1 / keep_prob) = 1) is the same as in testing. More information can be found in this paper.
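To check the expectation argument numerically, here is a small sketch in plain NumPy (the variable names are illustrative) that estimates the mean of one activation under keep-and-scale:

    import numpy as np

    keep_prob = 0.1
    x = 1.0                   # activation of a single unit
    trials = 100000
    # Keep with probability keep_prob and scale by 1/keep_prob, else output 0.
    kept = np.random.rand(trials) < keep_prob
    samples = np.where(kept, x / keep_prob, 0.0)
    print(samples.mean())     # ~1.0, matching the unscaled test-time activation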

Suppose you have 10 units in the layer and set keep_prob to 0.1. Since each unit is dropped independently, on average 9 of the 10 units will have their activation set to 0, and the survivors will be scaled by a factor of 10. I think a more precise description is that you only keep the activation of 10 percent of the nodes.
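As a concrete sketch with tf.nn.dropout itself (assuming the TensorFlow 1.x signature that takes keep_prob):

    import tensorflow as tf  # assumes TensorFlow 1.x, where tf.nn.dropout takes keep_prob

    x = tf.ones([10])                          # 10 units, activation 1.0 each
    dropped = tf.nn.dropout(x, keep_prob=0.1)

    with tf.Session() as sess:
        print(sess.run(dropped))
        # e.g. [ 0.  0. 10.  0.  0.  0.  0.  0.  0.  0.]
        # on average one unit survives, scaled by 1/0.1 = 10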
