How to calculate prediction uncertainty using Keras?


Question

I would like to calculate NN model certainty/confidence (see What my deep model doesn't know) - when NN tells me an image represents "8", I would like to know how certain it is. Is my model 99% certain it is "8" or is it 51% it is "8", but it could also be "6"? Some digits are quite ambiguous and I would like to know for which images the model is just "flipping a coin".

I have found some theoretical writings about this but I have trouble putting this in code. If I understand correctly, I should evaluate a testing image multiple times while "killing off" different neurons (using dropout) and then...?

Working on MNIST dataset, I am running the following model:

from keras.models import Sequential
from keras.layers import Dense, Activation, Conv2D, Flatten, Dropout

model = Sequential()
model.add(Conv2D(128, kernel_size=(7, 7),
                 activation='relu',
                 input_shape=(28, 28, 1,)))
model.add(Dropout(0.20))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(Dropout(0.20))
model.add(Flatten())
model.add(Dense(units=64, activation='relu'))
model.add(Dropout(0.25))
model.add(Dense(units=10, activation='softmax'))
model.summary()
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])
model.fit(train_data, train_labels,  batch_size=100, epochs=30, validation_data=(test_data, test_labels,))

How should I predict with this model so that I get its certainty about predictions too? I would appreciate some practical examples (preferably in Keras, but any will do).

To clarify, I am looking for an example of how to get certainty using the method outlined by Yarin Gal (or an explanation of why some other method yields better results).

Answer

If you want to implement the dropout approach to measure uncertainty, you should do the following:

  1. Implement a function which applies dropout during test time as well:

import keras.backend as K

# Backend function that takes the learning phase as an extra input, so dropout
# can be kept active (learning_phase = 1) at prediction time.
f = K.function([model.layers[0].input, K.learning_phase()],
               [model.layers[-1].output])
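
If you are on a newer tf.keras where K.learning_phase() is no longer available, a roughly equivalent stochastic forward pass can be obtained by calling the model directly with training=True. A minimal sketch, assuming TensorFlow 2.x and using the test_data array from the question:

# Assumes tf.keras 2.x: calling the model with training=True keeps the dropout
# layers active, so each call is one stochastic forward pass.
mc_pass = model(test_data[:32], training=True).numpy()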

  2. Use the function f as an uncertainty predictor, e.g. in the following manner:

import numpy as np

def predict_with_uncertainty(f, x, n_iter=10):
    # Run n_iter stochastic forward passes; passing learning_phase = 1
    # keeps the dropout layers active.
    results = np.array([f([x, 1])[0] for _ in range(n_iter)])

    prediction = results.mean(axis=0)   # Monte Carlo estimate of the class probabilities
    uncertainty = results.var(axis=0)   # per-class variance across the stochastic passes
    return prediction, uncertainty

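For example, to score a batch of test images (a minimal sketch; the batch size of 32 and n_iter=20 are arbitrary choices, and test_data is the array used to fit the model above):

# predictions and uncertainties both have shape (32, 10) for the model above.
predictions, uncertainties = predict_with_uncertainty(f, test_data[:32], n_iter=20)
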
  3. Of course, you may use any different function to compute the uncertainty; one common choice, the predictive entropy, is sketched below.
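
For instance, the predictive entropy of the Monte Carlo averaged probabilities collapses the per-class variances into a single score per image. A minimal sketch building on the prediction returned above (eps is just a numerical-stability constant):

import numpy as np

def predictive_entropy(prediction, eps=1e-12):
    # prediction: MC-averaged softmax probabilities with shape (n_samples, n_classes).
    # Returns one uncertainty value per sample; higher means the model is less certain.
    return -np.sum(prediction * np.log(prediction + eps), axis=-1)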

