What is the difference between cross-entropy and log loss error?


Question

What is the difference between cross-entropy and log loss error? The formulae for both seem to be very similar.
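
For concreteness, the two formulas usually being compared are the ones below; the notation is the standard one rather than anything given in the question ($y$ is the true label, $p$ the predicted probability of the positive class, $K$ the number of classes):

$$\text{log loss} = -\bigl(y \log p + (1 - y)\log(1 - p)\bigr)$$

$$\text{cross-entropy} = -\sum_{k=1}^{K} y_k \log p_k$$

Setting $K = 2$ with $y_1 = y$, $y_2 = 1 - y$ and $p_1 = p$, $p_2 = 1 - p$ reduces the second formula to the first, which is why the two look so similar.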

Recommended Answer

They are essentially the same; usually, we use the term log loss for binary classification problems, and the more general cross-entropy (loss) for the general case of multi-class classification, but even this distinction is not consistent, and you'll often find the terms used interchangeably as synonyms.
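
A minimal NumPy sketch of this equivalence for a single example (the names y, p, and the two helper functions are illustrative, not from the answer): the binary log loss and the two-class cross-entropy compute the same number.

```python
import numpy as np

def log_loss_binary(y, p):
    # Binary log loss for one example: y is 0 or 1, p = P(class 1)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def cross_entropy(y_onehot, p_dist):
    # General cross-entropy between a one-hot target and a predicted distribution
    return -np.sum(y_onehot * np.log(p_dist))

y, p = 1, 0.8
print(log_loss_binary(y, p))               # 0.2231...
print(cross_entropy(np.array([0.0, 1.0]),  # same example written as a
                    np.array([0.2, 0.8]))) # 2-class distribution: 0.2231...
```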

From the Wikipedia entry on cross-entropy:

The logistic loss is sometimes called cross-entropy loss. It is also known as log loss.

From the fast.ai wiki entry on log loss:

Log loss and cross-entropy are slightly different depending on the context, but in machine learning when calculating error rates between 0 and 1 they resolve to the same thing.

From the ML Cheatsheet:

Cross-entropy loss, or log loss, measures the performance of a classification model whose output is a probability value between 0 and 1.
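
Libraries reflect the same equivalence in practice. For example, scikit-learn exposes a single sklearn.metrics.log_loss function that handles both cases; a small illustrative call (the inputs are made-up probabilities, assuming scikit-learn is installed):

```python
from sklearn.metrics import log_loss

# Binary case: average binary log loss over three examples,
# given the predicted probability of the positive class
print(log_loss([1, 0, 1], [0.9, 0.2, 0.7]))

# Multi-class case: the same function computes the categorical
# cross-entropy, given one probability distribution per example
print(log_loss([0, 2, 1],
               [[0.8, 0.1, 0.1],
                [0.1, 0.2, 0.7],
                [0.2, 0.6, 0.2]]))
```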
