What is the role of the bias in neural networks?


Question


I'm aware of the gradient descent and the back-propagation algorithm. What I don't get is: when is using a bias important and how do you use it?

For example, when mapping the AND function, when I use 2 inputs and 1 output, it does not give the correct weights, however, when I use 3 inputs (1 of which is a bias), it gives the correct weights.
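The behavior described above can be reproduced with a small sketch (the `train_perceptron` helper and its parameters are our own, not from the question): a single perceptron with a step activation, trained on the AND truth table using a constant third input of 1.0 as the bias. Without that bias input, no weight vector through the origin can separate AND, which is why the 2-input version fails to find correct weights.

```python
import numpy as np

def step(z):
    # Step activation: fire only on strictly positive input.
    return 1 if z > 0 else 0

def train_perceptron(X, y, lr=0.1, epochs=50):
    # Classic perceptron learning rule; hypothetical helper for illustration.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            w += lr * (yi - step(w @ xi)) * xi
    return w

# AND truth table; the third column is the constant bias input (always 1.0).
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = [0, 0, 0, 1]

w = train_perceptron(X, y)
preds = [step(w @ xi) for xi in X]
print(preds)  # [0, 0, 0, 1] -- matches the AND function
```

Dropping the bias column makes the same training loop fail: with only two inputs, the decision boundary is forced through the origin, and the four AND cases are no longer separable.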

Solution

I think that biases are almost always helpful. In effect, a bias value allows you to shift the activation function to the left or right, which may be critical for successful learning.

It might help to look at a simple example. Consider this 1-input, 1-output network that has no bias:

The output of the network is computed by multiplying the input (x) by the weight (w0) and passing the result through some kind of activation function (e.g., a sigmoid function).

Here is the function that this network computes, for various values of w0:

Changing the weight w0 essentially changes the "steepness" of the sigmoid. That's useful, but what if you wanted the network to output 0 when x is 2? Just changing the steepness of the sigmoid won't really work -- you want to be able to shift the entire curve to the right.
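A quick numeric check makes this concrete (a minimal sketch; the `sig` helper is just the standard logistic function): without a bias, the network computes sig(w0*x), so no matter how large w0 gets, the output at x = 0 is pinned to sig(0) = 0.5. The weight only controls steepness, never position.

```python
import math

def sig(z):
    # Logistic sigmoid activation.
    return 1.0 / (1.0 + math.exp(-z))

# For any weight w0, the bias-free network's output at x = 0 is sig(0) = 0.5.
outputs_at_zero = [sig(w0 * 0.0) for w0 in (0.5, 1.0, 2.0, 10.0)]
print(outputs_at_zero)  # [0.5, 0.5, 0.5, 0.5]
```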

That's exactly what the bias allows you to do. If we add a bias to that network, like so:

...then the output of the network becomes sig(w0*x + w1*1.0). Here is what the output of the network looks like for various values of w1:

Having a weight of -5 for w1 shifts the curve to the right, which allows us to have a network that outputs 0 when x is 2.
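Plugging in the numbers confirms the shift (w1 = -5 comes from the answer; w0 = 1.0 is an assumed value for illustration): the bias term moves the sigmoid so that its output at x = 2 is essentially 0.

```python
import math

def sig(z):
    return 1.0 / (1.0 + math.exp(-z))

w0, w1 = 1.0, -5.0   # w1 = -5 as in the answer; w0 = 1.0 is assumed
x = 2.0
out = sig(w0 * x + w1 * 1.0)   # sig(2 - 5) = sig(-3)
print(round(out, 3))  # 0.047 -- effectively 0 at x = 2
```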
