Does a Neural Network with Sigmoid Activation use Thresholds?


Question

I'm a tad confused here. I just started on the subject of Neural Networks, and the first one I constructed used step activation with a threshold on each neuron. Now I want to implement sigmoid activation, but this type of activation doesn't seem to use thresholds, only the weights between the neurons. Yet the material I've found on the topic does mention thresholds; I just can't see where they fit into the activation function.

Is a threshold used in the sigmoid activation function of a neural network?

Answer

There is no discrete jump as there is with step activation. The threshold can be thought of as the point where the sigmoid function outputs 0.5. Some sigmoid functions have this at 0, while others have it shifted to a different "threshold".
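
A minimal sketch (Python with NumPy; the `threshold` parameter is just an illustrative name for the bias term, not anything from the question's code) of how shifting the sigmoid's input moves the point where it outputs 0.5:

```python
import numpy as np

def sigmoid(x, threshold=0.0):
    # Subtracting the threshold (the neuron's bias, with opposite sign)
    # moves the point where the output crosses 0.5 away from x = 0.
    return 1.0 / (1.0 + np.exp(-(x - threshold)))

print(sigmoid(0.0))                  # 0.5 -> crossing at x = 0
print(sigmoid(2.0, threshold=2.0))   # 0.5 -> crossing shifted to x = 2
print(sigmoid(0.0, threshold=2.0))   # ~0.12 -> below the 0.5 crossing
```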

The step function can be thought of as a sigmoid whose steepness is set to infinity. There is an obvious threshold in that case; for less steep sigmoid functions, the threshold can be taken to be the point where the function's value is 0.5, or equivalently its point of maximum steepness.
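
To make that limit concrete, here is a small sketch (Python with NumPy; the `steepness` factor is an illustrative parameter introduced here, not part of the question) comparing sigmoids of increasing steepness with the step function:

```python
import numpy as np

def sigmoid(x, steepness=1.0):
    # Larger steepness values make the transition around x = 0 sharper.
    return 1.0 / (1.0 + np.exp(-steepness * x))

def step(x):
    # Step activation with threshold 0: what the sigmoid approaches
    # as the steepness grows without bound.
    return np.where(x >= 0, 1.0, 0.0)

xs = np.array([-1.0, -0.1, 0.0, 0.1, 1.0])
for k in (1, 10, 100):
    print(f"k={k:>3}:", np.round(sigmoid(xs, steepness=k), 3))
print("step: ", step(xs))
```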
