In neural networks, do bias terms have a weight?


Problem description

I am trying to code my own neural networks, but there is something I don't understand about bias terms. I know each link from one neuron to another has a weight, but does the link between a bias and the neuron it's connected to also have a weight? Or can I think of that weight as always being 1 and never changing?

Thanks

Recommended answer

The bias terms do have weights. Typically, you add a bias to every neuron in the hidden layers as well as to the neurons in the output layer (prior to squashing).
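For concreteness, here is a minimal sketch (in Python/NumPy; the code and all values are my own illustration, not from the original answer) of a single neuron whose bias is a trainable parameter just like the link weights:

```python
import numpy as np

def neuron_output(x, w, b):
    # Weighted sum of the inputs plus the bias, followed by a
    # sigmoid "squashing" function, as described in the answer.
    z = np.dot(w, x) + b
    return 1.0 / (1.0 + np.exp(-z))

# Made-up values, purely for illustration.
x = np.array([0.5, -1.2, 0.3])   # activations from the previous layer
w = np.array([0.8, 0.1, -0.4])   # weights on the incoming links
b = 0.25                         # the bias: a trainable weight, not a fixed constant

print(neuron_output(x, w, b))    # ~0.60
```

During training, b is updated by gradient descent exactly like the entries of w; only the bias's *input* is fixed at 1, not its weight.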

Have a look at the basic structure of Artificial Neurons; you will see that the bias is added as w_k0 = b_k. For a more thorough example, see e.g. this link, which contains formulas as well as a visualisation of a multi-layered NN.
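To make the w_k0 = b_k convention concrete, here is a small sketch (my own illustration, under the assumption of made-up values) showing that folding the bias into the weight vector, with a constant input of 1, yields the same pre-activation value:

```python
import numpy as np

w = np.array([0.8, 0.1, -0.4])   # ordinary input weights
b = 0.25                         # bias
x = np.array([0.5, -1.2, 0.3])   # inputs

# w_0 = b and x_0 = 1: the bias becomes just another weight
# attached to a "bias unit" whose output is always 1.
w_aug = np.concatenate(([b], w))
x_aug = np.concatenate(([1.0], x))

# Both formulations give the same pre-activation value.
assert np.isclose(np.dot(w, x) + b, np.dot(w_aug, x_aug))
```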

For a discussion of the choice of weights, refer to the following stats.stackexchange thread:
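As one deliberately simple example of such a choice (an assumption of standard practice on my part, since the thread's content is not reproduced here), weights are often initialised to small random values while biases start at zero but remain trainable:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    # One common scheme: small random weights, biases started at zero.
    # The bias is still a trainable weight; zero is only its starting value.
    W = rng.normal(0.0, 0.01, size=(n_out, n_in))
    b = np.zeros(n_out)
    return W, b

W, b = init_layer(3, 2)
```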

