What is the meaning of the word logits in TensorFlow?


Problem description


In the following TensorFlow function, we must feed the activation of the artificial neurons in the final layer. That I understand. But I don't understand why it is called logits. Isn't that a mathematical function?

loss_function = tf.nn.softmax_cross_entropy_with_logits(
    logits=last_layer,
    labels=target_output
)

Solution

Logits is an overloaded term which can mean many different things:


In math, logit is a function that maps probabilities in (0, 1) to the real line (-inf, inf).

A probability of 0.5 corresponds to a logit of 0. Negative logits correspond to probabilities below 0.5, positive logits to probabilities above 0.5.
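As a quick illustration, here is a minimal sketch in plain Python (the helper logit below is defined just for this example): the logit of a probability p is log(p / (1 - p)), so 0.5 maps to 0, smaller probabilities map to negative values and larger ones to positive values.

import math

def logit(p):
    # Maps a probability in (0, 1) to a real number in (-inf, inf).
    return math.log(p / (1 - p))

print(logit(0.5))   #  0.0     -> p = 0.5 gives logit 0
print(logit(0.25))  # -1.0986  -> p < 0.5 gives a negative logit
print(logit(0.75))  #  1.0986  -> p > 0.5 gives a positive logit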

In ML, it can be

the vector of raw (non-normalized) predictions that a classification model generates, which is ordinarily then passed to a normalization function. If the model is solving a multi-class classification problem, logits typically become an input to the softmax function. The softmax function then generates a vector of (normalized) probabilities with one value for each possible class.
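For instance, here is a minimal sketch assuming TensorFlow 2.x (the tensor values are made up for illustration): the raw scores coming out of the last layer are the logits, tf.nn.softmax turns them into probabilities, and tf.nn.softmax_cross_entropy_with_logits takes the raw logits directly because it applies the softmax normalization internally.

import tensorflow as tf

# Raw, unnormalized scores (logits) from the final layer: 1 example, 3 classes.
last_layer = tf.constant([[2.0, 1.0, 0.1]])
target_output = tf.constant([[1.0, 0.0, 0.0]])  # one-hot label

probabilities = tf.nn.softmax(last_layer)  # normalized, sums to 1 per example
loss = tf.nn.softmax_cross_entropy_with_logits(
    labels=target_output, logits=last_layer)

print(probabilities.numpy())  # approx. [[0.659 0.242 0.099]]
print(loss.numpy())           # approx. [0.417], i.e. -log(0.659)

Passing already normalized probabilities into that argument would give a different (wrong) loss, which is why the parameter is explicitly named logits.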

Logits also sometimes refer to the element-wise inverse of the sigmoid function.
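In that third sense, here is a minimal sketch (again assuming TensorFlow 2.x) showing that the element-wise logit, log(p / (1 - p)), undoes the sigmoid:

import tensorflow as tf

x = tf.constant([-2.0, 0.0, 3.0])
p = tf.math.sigmoid(x)                # element-wise sigmoid, values in (0, 1)
recovered = tf.math.log(p / (1 - p))  # element-wise logit, the inverse of sigmoid

print(recovered.numpy())  # approx. [-2.  0.  3.], the original x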
