How to turn off dropout for testing in Tensorflow?

Problem description

I am fairly new to Tensorflow and ML in general, so I apologize in advance for a (likely) trivial question.

I use the dropout technique to improve the learning rate of my network, and it seems to work just fine. Then I would like to test the network on some data, like this:

def Ask(self, image):
    return self.session.run(self.model, feed_dict={self.inputPh: image})

Obviously, it yields different results each time, as the dropout is still in place. One solution I can think of is to create two separate models - one for training and the other for the actual later use of the network; however, such a solution seems impractical to me.

What's the common approach to solving this problem?

Recommended answer

The easiest way is to change the keep_prob parameter using a placeholder_with_default:

prob = tf.placeholder_with_default(1.0, shape=())
layer = tf.nn.dropout(layer, prob)

This way, when you train, you can set the parameter like this:

sess.run(train_step, feed_dict={prob: 0.5})

and when you evaluate, the default value of 1.0 is used, so dropout is effectively turned off.
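
For completeness, here is a minimal end-to-end sketch of this pattern, assuming a TF1-style graph API (tf.compat.v1 under TF 2.x); the layer sizes, placeholder names, and random toy data are made up purely for illustration:

import numpy as np
import tensorflow as tf  # assumes a TF1-style graph API (use tf.compat.v1 under TF 2.x)

# Keep probability defaults to 1.0, i.e. dropout is effectively disabled unless overridden.
prob = tf.placeholder_with_default(1.0, shape=())

inputPh = tf.placeholder(tf.float32, shape=(None, 784))
labels = tf.placeholder(tf.float32, shape=(None, 10))

hidden = tf.layers.dense(inputPh, 128, activation=tf.nn.relu)
hidden = tf.nn.dropout(hidden, prob)  # second argument is keep_prob in the TF1 signature
logits = tf.layers.dense(hidden, 10)

loss = tf.losses.softmax_cross_entropy(labels, logits)
train_step = tf.train.AdamOptimizer(1e-3).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Toy batch, just to make the sketch runnable.
    x_batch = np.random.rand(32, 784).astype(np.float32)
    y_batch = np.eye(10)[np.random.randint(0, 10, 32)].astype(np.float32)

    # Training: feed a keep probability below 1.0 so dropout is active.
    sess.run(train_step, feed_dict={inputPh: x_batch, labels: y_batch, prob: 0.5})

    # Evaluation: do not feed `prob`; the default of 1.0 is used and dropout is off.
    predictions = sess.run(logits, feed_dict={inputPh: x_batch})

Because the placeholder carries a default, the evaluation call never has to mention dropout at all, so the same graph serves both training and inference without duplicating the model.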
