Weight samples if incorrectly guessed in binary cross entropy

Question

Is there a way in Keras or TensorFlow to give samples an extra weight only if they are incorrectly classified? I.e. a combination of class weight and sample weight, but only applying the sample weight to one of the outcomes of a binary class?

Answer

Yes, it's possible. Below you may find an example of how to add extra weight to true positives, false positives, true negatives, and false negatives:

from tensorflow.keras import backend as K  # import added; assuming tf.keras (the standalone Keras backend works the same)


def reweight(y_true, y_pred, tp_weight=0.2, tn_weight=0.2, fp_weight=1.2, fn_weight=1.2):
    # Threshold the predicted probabilities at 0.5 to get hard class predictions
    y_pred_classes = K.greater_equal(y_pred, 0.5)
    y_pred_classes_float = K.cast(y_pred_classes, K.floatx())

    # Mask of misclassified examples (1.0 where prediction != label)
    wrongly_classified = K.not_equal(y_true, y_pred_classes_float)
    wrongly_classified_float = K.cast(wrongly_classified, K.floatx())

    # Mask of correctly classified examples (1.0 where prediction == label)
    correctly_classified = K.equal(y_true, y_pred_classes_float)
    correctly_classified_float = K.cast(correctly_classified, K.floatx())

    # Split the masks into tp, tn, fp, fn
    tp = correctly_classified_float * y_true        # correct, label 1
    tn = correctly_classified_float * (1 - y_true)  # correct, label 0
    fp = wrongly_classified_float * (1 - y_true)    # wrong, label 0 (predicted 1)
    fn = wrongly_classified_float * y_true          # wrong, label 1 (predicted 0)

    # Per-example weight determined by which of the four cases it falls into
    weight_tensor = tp_weight * tp + fp_weight * fp + tn_weight * tn + fn_weight * fn

    loss = K.binary_crossentropy(y_true, y_pred)
    weighted_loss = loss * weight_tensor
    return weighted_loss
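
As a usage note, this weighted loss can be passed to model.compile like any custom Keras loss. The sketch below is not part of the original answer: the model architecture, data shapes, and the heavier fn_weight are illustrative assumptions, and it assumes tf.keras:

import numpy as np
from tensorflow import keras

# Hypothetical binary classifier on 10 input features
model = keras.Sequential([
    keras.layers.Dense(16, activation='relu', input_shape=(10,)),
    keras.layers.Dense(1, activation='sigmoid'),
])

# Keras calls the loss as loss(y_true, y_pred), so the default weights apply;
# wrap it in a lambda to override them, e.g. to penalise false negatives harder.
model.compile(optimizer='adam',
              loss=lambda yt, yp: reweight(yt, yp, fn_weight=2.0))

# Dummy data just to show the loss runs end to end
X = np.random.rand(64, 10).astype('float32')
y = np.random.randint(0, 2, size=(64, 1)).astype('float32')
model.fit(X, y, epochs=1, verbose=0)

Because reweight returns a per-element loss, Keras reduces it over the batch automatically, so no extra averaging is needed.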
