How to get IoU of a single class in Keras semantic segmentation?


Problem description


I am using the Image segmentation guide by fchollet to perform semantic segmentation. I have attempted to modify the guide to suit my dataset by labelling the 8-bit image mask values as 1 and 2, like in the Oxford Pets dataset (these are then shifted down to 0 and 1 inside class OxfordPets(keras.utils.Sequence):).
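
For reference, the relabelling happens in the mask-loading step of the Sequence, roughly like the sketch below. This is only a trimmed illustration of what I changed relative to the guide; load_mask and img_size are placeholder names, not my actual code.

import numpy as np
from tensorflow.keras.preprocessing.image import load_img

def load_mask(path, img_size=(160, 160)):
    # Load the 8-bit mask as a single-channel image.
    img = load_img(path, target_size=img_size, color_mode="grayscale")
    y = np.expand_dims(np.array(img, dtype="uint8"), axis=-1)
    # My mask pixel values are 1 and 2; subtracting 1 gives class ids 0 and 1,
    # mirroring the subtract-one step in the guide's OxfordPets class.
    y -= 1
    return y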

The question is: how do I get the IoU metric of a single class (e.g. 1)?

I have tried different metrics suggested on Stack Overflow, but most of them suggest using MeanIoU, which I tried, but I got a nan loss as a result. Here is an example of a mask after using autocontrast: PIL.ImageOps.autocontrast(load_img(val_target_img_paths[i]))

The model seems to train well, but the accuracy decreases over time.

Also, can someone help explain how the metric score is calculated from y_true and y_pred? I don't quite understand when the label value is used in the IoU metric calculation.

Solution

I had a similar problem back then. I used jaccard_distance_loss and dice_metric; both are based on IoU. My task was binary segmentation, so you might have to modify the code if you want to use it for a multi-class problem.

import numpy as np
from keras import backend as K

def jaccard_distance_loss(y_true, y_pred, smooth=100):
    """
    Jaccard = (|X & Y|)/ (|X|+ |Y| - |X & Y|)
            = sum(|A*B|)/(sum(|A|)+sum(|B|)-sum(|A*B|))
    
    The Jaccard distance loss is useful for unbalanced datasets. This has been
    shifted so it converges on 0 and is smoothed to avoid exploding or disappearing
    gradients.
    
    Ref: https://en.wikipedia.org/wiki/Jaccard_index
    
    @url: https://gist.github.com/wassname/f1452b748efcbeb4cb9b1d059dce6f96
    @author: wassname
    """
    intersection = K.sum(K.sum(K.abs(y_true * y_pred), axis=-1))
    sum_ = K.sum(K.sum(K.abs(y_true) + K.abs(y_pred), axis=-1))
    jac = (intersection + smooth) / (sum_ - intersection + smooth)
    return (1 - jac) * smooth

def dice_metric(y_pred, y_true):
    intersection = K.sum(K.sum(K.abs(y_true * y_pred), axis=-1))
    union = K.sum(K.sum(K.abs(y_true) + K.abs(y_pred), axis=-1))
    # if y_true.sum() == 0 and y_pred.sum() == 0:
    #     return 1.0

    return 2*intersection / union

# Example
size = 10

y_true = np.zeros(shape=(size,size))
y_true[3:6,3:6] = 1

y_pred = np.zeros(shape=(size,size))
y_pred[3:5,3:5] = 1

loss = jaccard_distance_loss(y_true,y_pred)

metric = dice_metric(y_pred,y_true)

print(f"loss: {loss}")
print(f"dice_metric: {metric}")

loss: 4.587155963302747
dice_metric: 0.6153846153846154
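
For the single-class part of the question: once the masks hold integer class ids (0 and 1 after the subtraction), you can get the IoU of just one class by binarizing both arrays against that class id and then applying the same intersection-over-union formula. Here is a minimal NumPy sketch, not tested on your data; single_class_iou is a placeholder name and class id 1 stands for whichever class you care about.

import numpy as np

def single_class_iou(y_true, y_pred, class_id, smooth=1e-7):
    # Binarize both masks against the class of interest, then compute
    # intersection / union over all pixels.
    t = (y_true == class_id).astype(np.float64)
    p = (y_pred == class_id).astype(np.float64)
    intersection = np.sum(t * p)
    union = np.sum(t) + np.sum(p) - intersection
    return (intersection + smooth) / (union + smooth)

# Same toy masks as above, stored as integer class ids 0 / 1
size = 10
y_true = np.zeros((size, size), dtype=np.int32)
y_true[3:6, 3:6] = 1
y_pred = np.zeros((size, size), dtype=np.int32)
y_pred[3:5, 3:5] = 1

print(single_class_iou(y_true, y_pred, class_id=1))  # 4 / 9 ≈ 0.444

If you prefer a built-in metric, newer TensorFlow versions ship tf.keras.metrics.IoU with a target_class_ids argument that reports the same per-class score. Note that MeanIoU expects integer class ids for both y_true and y_pred, so probability maps usually need an argmax over the channel axis first; I can only guess that this is related to the nan you saw.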
