Use TensorFlow loss Global Objectives (recall_at_precision_loss) with Keras (not metrics)

Problem description

I have a multi-label classification problem with 5 labels (e.g. [1 0 1 1 0]). Therefore, I want my model to improve at metrics such as fixed recall, precision-recall AUC or ROC AUC.

It doesn't make sense to use a loss function (e.g. binary_crossentropy) that isn't directly related to the performance measure I want to optimize. Therefore, I want to use TensorFlow's global_objectives.recall_at_precision_loss() or similar as the loss function.

  • Relevant GitHub: https://github.com/tensorflow/models/tree/master/research/global_objectives
  • Relevant paper (Scalable Learning of Non-Decomposable Objectives): https://arxiv.org/abs/1608.04802

I'm not looking to implement a tf.metrics function; I already succeeded in that by following: https://stackoverflow.com/a/50566908/3399066

I think my issue can be divided into 2 problems:

  1. How to use global_objectives.recall_at_precision_loss() or similar?
  2. How to use it in a Keras model with TF backend?

Problem 1

There is a file called loss_layers_example.py on the Global Objectives GitHub page (same as above). However, since I don't have much experience with TF, I don't really understand how to use it. Also, Googling for "TensorFlow recall_at_precision_loss example" or "TensorFlow Global Objectives example" doesn't turn up any clearer examples.

How do I use global_objectives.recall_at_precision_loss() in a simple TF example?
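
For reference, here is roughly what I imagine a minimal graph-mode TF example would look like, pieced together from the function signatures in loss_layers.py. The import path, placeholder shapes, the single dense layer and the target precision of 0.95 are all just illustrative guesses on my part, not something I have verified:

import tensorflow as tf
from global_objectives import loss_layers  # assumes research/global_objectives is on the Python path

# Illustrative shapes: a batch of 10-dimensional examples with 5 binary labels each.
features = tf.placeholder(tf.float32, shape=(None, 10))
labels = tf.placeholder(tf.float32, shape=(None, 5))

# The global objectives losses take logits, so no sigmoid on the output layer.
logits = tf.layers.dense(features, 5)

# recall_at_precision_loss returns (loss, other_outputs); reduce the loss to a scalar.
loss, _ = loss_layers.recall_at_precision_loss(labels, logits, target_precision=0.95)
loss = tf.reduce_mean(loss)

train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

with tf.Session() as sess:
    # The loss creates internal (dual) variables of its own, so initialize everything.
    sess.run(tf.global_variables_initializer())
    # sess.run([train_op, loss], feed_dict={features: ..., labels: ...})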

Would something like (in Keras): model.compile(loss = ??.recall_at_precision_loss, ...) be enough? My feeling tells me it is more complex than that, due to the global variables used in loss_layers_example.py.

How can I use loss functions similar to global_objectives.recall_at_precision_loss() in Keras?

Recommended answer

Similar to Martino's answer, but this version infers the shape from the input (setting it to a fixed batch size did not work for me).

The outer function isn't strictly necessary, but it feels a bit more natural to pass parameters when you configure the loss function, especially if the wrapper is defined in an external module.

import keras.backend as K
from global_objectives.loss_layers import precision_at_recall_loss

def get_precision_at_recall_loss(target_recall):
    # Returns a Keras-compatible loss function closed over the target recall.
    def precision_at_recall_loss_wrapper(y_true, y_pred):
        # Flatten (batch, num_labels) into a single column; the shape is
        # inferred from the input rather than fixed to a batch size.
        y_true = K.reshape(y_true, (-1, 1))
        y_pred = K.reshape(y_pred, (-1, 1))
        # precision_at_recall_loss returns (loss, other_outputs); keep only the loss.
        return precision_at_recall_loss(y_true, y_pred, target_recall)[0]
    return precision_at_recall_loss_wrapper

Then, when compiling the model:

TARGET_RECALL = 0.9
model.compile(optimizer='adam', loss=get_precision_at_recall_loss(TARGET_RECALL))
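
To round it off, here is a small end-to-end sketch using the wrapper above. The network architecture, input dimension and random dummy data are purely illustrative assumptions; note the linear output layer, since the global objectives losses operate on logits rather than probabilities (apply a sigmoid to the predictions yourself when you need probabilities):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

NUM_LABELS = 5
TARGET_RECALL = 0.9

model = Sequential([
    Dense(32, activation='relu', input_shape=(20,)),
    Dense(NUM_LABELS),  # linear activation: the loss expects logits
])

model.compile(optimizer='adam', loss=get_precision_at_recall_loss(TARGET_RECALL))

# Random multi-label dummy data, just to demonstrate the training call.
X = np.random.rand(256, 20).astype('float32')
Y = (np.random.rand(256, NUM_LABELS) > 0.5).astype('float32')
model.fit(X, Y, batch_size=32, epochs=2)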
