Counterpart to categorical crossentropy for not one-hot encoded labels


Problem description

I'm building a neural network with Keras where my labels are vectors in which exactly 6 values are 1 while all the other values (around 7000) are zero. I'm currently using categorical_crossentropy as my loss function, but the documentation says:

Note: when using the categorical_crossentropy loss, your targets should be in categorical format (e.g. if you have 10 classes, the target for each sample should be a 10-dimensional vector that is all-zeros except for a 1 at the index corresponding to the class of the sample).

So what would be the "right" loss function if categorical_crossentropy is only correct for one-hot encoded labels?
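For reference, a small sketch (not part of the original question) of what the "categorical format" quoted above looks like in Keras, using keras.utils.to_categorical; the class count and label values here are assumed purely for illustration:

from tensorflow.keras.utils import to_categorical

# Three samples with integer class indices 0, 3 and 9 (assumed example values).
labels = [0, 3, 9]

# Convert to the one-hot "categorical format" expected by categorical_crossentropy:
# each row is a 10-dimensional vector with a single 1 at the class index.
one_hot = to_categorical(labels, num_classes=10)
print(one_hot[1])  # -> [0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]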

Answer

You can use sparse_categorical_crossentropy as the loss, which accepts integer class indices instead of one-hot encoded ones.
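As a rough sketch of how that looks in practice (assumptions: a TensorFlow/Keras backend, an arbitrary input size of 128, roughly the 7000 classes mentioned in the question, and a single integer class index per sample as the target):

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

num_classes = 7000   # assumed, matching the ~7000 classes in the question
input_dim = 128      # assumed input feature size for illustration

model = keras.Sequential([
    layers.Input(shape=(input_dim,)),
    layers.Dense(256, activation="relu"),
    layers.Dense(num_classes, activation="softmax"),
])

# sparse_categorical_crossentropy takes integer class indices as targets,
# so the labels do not need to be one-hot encoded.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["sparse_categorical_accuracy"])

# Dummy data: x is a batch of feature vectors, y is one integer class index per sample.
x = np.random.rand(32, input_dim).astype("float32")
y = np.random.randint(0, num_classes, size=(32,))

model.fit(x, y, epochs=1, batch_size=8)

The practical benefit of the sparse variant is that you avoid materialising a ~7000-dimensional one-hot vector for every sample; the target array is just one integer per sample.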
