How to implement fact related to false positive vs. false negative balance in neural network?


Question

I have a yes/no classification problem, where false positives are worse than false negatives.

Is there a way to build this fact into a neural network, specifically in MATLAB's Neural Network Toolbox?

Answer

What you need is a cost-sensitive meta-classifier (a meta-classifier works with any underlying classifier, be it an ANN, an SVM, or any other).

This can be done in two ways:

  • Re-weighting the training instances according to a cost matrix. This is done by resampling the data so that a particular class is over-represented; the resulting model is therefore more sensitive to that class than to the others.
  • Predicting the class with the minimum expected misclassification cost (rather than the most likely class). The idea here is to minimize the total expected cost by making cheap mistakes more often and expensive mistakes less often.
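The first idea, re-weighting by resampling, can be sketched as follows. This is a minimal illustration, not any toolbox's API: the function name `oversample_by_cost` and the weight values are hypothetical, and the weights play the role of relative misclassification costs.

```python
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def oversample_by_cost(X, y, class_weight):
    """Resample the training set so that costlier classes are
    over-represented. class_weight maps a label to a relative
    misclassification cost; a class with weight k ends up with
    roughly k times its original number of instances."""
    idx = []
    for label, w in class_weight.items():
        cls = np.flatnonzero(y == label)          # indices of this class
        n = int(round(w * len(cls)))              # target count after resampling
        idx.extend(rng.choice(cls, size=n, replace=True))
    idx = np.asarray(idx)
    return X[idx], y[idx]
```

Training any classifier on the resampled set makes errors on the heavily weighted class rarer, at the price of more errors on the other class.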

One algorithm that implements the first learning approach is SECOC, which uses error-correcting codes; an example of the second approach is MetaCost, which uses bagging to improve the classifier's probability estimates.
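The second idea, minimum-expected-cost prediction, can be sketched in a few lines. This is a minimal illustration under assumed numbers: the cost matrix values (false positive cost 5, false negative cost 1) and the function name are hypothetical, and the classifier's probability estimate is taken as given.

```python
import numpy as np

# Hypothetical cost matrix: rows = true class, columns = predicted class.
# Classes: 0 = "no", 1 = "yes". Illustrative numbers only:
# a false positive costs 5, a false negative costs 1.
COST = np.array([[0.0, 5.0],
                 [1.0, 0.0]])

def min_expected_cost_class(p_yes, cost=COST):
    """Return the class with the smallest expected misclassification
    cost, given the classifier's estimated P(yes)."""
    p = np.array([1.0 - p_yes, p_yes])   # [P(no), P(yes)]
    expected = p @ cost                  # expected cost of predicting each class
    return int(np.argmin(expected))
```

With P(yes) = 0.6 the expected cost of predicting "yes" is 0.4·5 = 2.0, versus 0.6·1 = 0.6 for "no", so the rule predicts "no" even though "yes" is the more likely class.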

