How to calculate prediction uncertainty using Keras?


Question


I would like to calculate the NN model's certainty/confidence (see What my deep model doesn't know) - when the NN tells me an image represents "8", I would like to know how certain it is. Is my model 99% certain it is "8", or is it 51% certain it is "8" but it could also be a "6"? Some digits are quite ambiguous, and I would like to know for which images the model is just "flipping a coin".


I have found some theoretical writing about this, but I have trouble putting it into code. If I understand correctly, I should evaluate a testing image multiple times while "killing off" different neurons (using dropout), and then...?


Working on the MNIST dataset, I am running the following model:

from keras.models import Sequential
from keras.layers import Dense, Activation, Conv2D, Flatten, Dropout

model = Sequential()
model.add(Conv2D(128, kernel_size=(7, 7),
                 activation='relu',
                 input_shape=(28, 28, 1,)))
model.add(Dropout(0.20))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(Dropout(0.20))
model.add(Flatten())
model.add(Dense(units=64, activation='relu'))
model.add(Dropout(0.25))
model.add(Dense(units=10, activation='softmax'))
model.summary()
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])
model.fit(train_data, train_labels,  batch_size=100, epochs=30, validation_data=(test_data, test_labels,))


How should I predict with this model so that I also get its certainty about the predictions? I would appreciate some practical examples (preferably in Keras, but any will do).


To clarify, I am looking for an example of how to get certainty using the method outlined by Yarin Gal (or an explanation of why some other method yields better results).

Answer


If you want to implement the dropout approach to measure uncertainty, you should do the following:

  1. Implement a function that keeps dropout active at test time as well:

import keras.backend as K

# Passing learning_phase=1 keeps dropout active in the forward pass
f = K.function([model.layers[0].input, K.learning_phase()],
               [model.layers[-1].output])


  2. Use this function as an uncertainty predictor, e.g. in the following manner:

    import numpy as np

    def predict_with_uncertainty(f, x, n_iter=10):
        # Run n_iter stochastic forward passes with dropout active
        # (learning_phase=1); f returns a list, so take element [0]
        results = np.array([f([x, 1])[0] for _ in range(n_iter)])

        prediction = results.mean(axis=0)
        uncertainty = results.var(axis=0)
        return prediction, uncertainty
    


    Of course, you may use any other function to compute the uncertainty.
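To illustrate how the mean/variance aggregation behaves, here is a self-contained sketch in plain NumPy. The randomly generated softmax vectors stand in for the dropout-enabled forward passes (no real model is involved); everything else mirrors the aggregation above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate n_iter stochastic forward passes for one image:
# each pass yields a softmax vector over the 10 MNIST classes.
n_iter, n_classes = 10, 10
logits = rng.normal(size=(n_iter, n_classes))
mc_samples = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

prediction = mc_samples.mean(axis=0)   # averaged class probabilities
uncertainty = mc_samples.var(axis=0)   # per-class variance across passes

predicted_class = prediction.argmax()
print(predicted_class, prediction[predicted_class], uncertainty[predicted_class])
```

A low variance for the winning class means the model gives roughly the same answer no matter which neurons were dropped; a high variance flags the "coin-flip" images the question asks about.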

