Keras - Loss per sample within batch

Problem Description

How do I get the per-sample loss during training instead of the total loss? The loss history is available, which gives the total batch loss, but it doesn't provide the loss for individual samples.

If possible I would like to have something like this:

on_batch_end(batch, logs, sample_losses)

Is something like this available, and if not, can you provide some hints on how to change the code to support it?
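
For reference, here is a minimal sketch of what the existing callback hook does provide (assuming a current tf.keras install; the class name is a hypothetical example): only the aggregated batch loss arrives in logs, with no per-sample breakdown.

import tensorflow as tf

class BatchLossLogger(tf.keras.callbacks.Callback):
    # Hypothetical callback: the built-in hook only receives the
    # mean loss over the batch via logs, not per-sample values.
    def on_train_batch_end(self, batch, logs=None):
        print("batch", batch, "mean loss:", logs["loss"])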

Recommended Answer

To the best of my knowledge it is not possible to get this information via callbacks, since the loss has already been computed (and reduced over the batch) by the time the callbacks are called (have a look at keras/engine/training.py). To simply inspect the per-sample losses you can override the loss function, e.g.:

import keras
import theano

def myloss(ytrue, ypred):
    # Per-sample MSE; Keras reduces this to the batch mean afterwards
    x = keras.objectives.mean_squared_error(ytrue, ypred)
    # Theano op that prints the per-sample loss vector on every evaluation
    return theano.printing.Print('loss for each sample')(x)

model.compile(optimizer='sgd', loss=myloss)
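
If you are on a newer TensorFlow/Keras stack without Theano, a roughly similar sketch (the toy model and random data below are hypothetical, only there to make it runnable) is to return the unreduced per-sample loss vector and print it with tf.print:

import numpy as np
import tensorflow as tf

def per_sample_mse(y_true, y_pred):
    # Mean squared error per sample (no reduction over the batch)
    losses = tf.reduce_mean(tf.square(y_true - y_pred), axis=-1)
    tf.print("loss for each sample:", losses)
    return losses  # Keras averages this vector for the reported batch loss

# Hypothetical toy model and data, only to exercise the loss function
model = tf.keras.Sequential([tf.keras.Input(shape=(4,)), tf.keras.layers.Dense(1)])
model.compile(optimizer="sgd", loss=per_sample_mse)
model.fit(np.random.rand(8, 4), np.random.rand(8, 1), batch_size=4, epochs=1)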
