Solving XOR with a single-layer perceptron
Question
I've always heard that the XOR problem cannot be solved by a single-layer perceptron (one without a hidden layer), since XOR is not linearly separable. I understand that no linear function can separate the classes.
However, what if we use a non-monotonic activation function such as sin() or cos()? Is this still the case? I would imagine these kinds of functions might be able to separate the classes.
Answer
Yes, a single-layer neural network with a non-monotonic activation function can solve the XOR problem. More specifically, a periodic function cuts the XY plane more than once; even an abs or Gaussian activation function will cut it twice.
Try it yourself: W1 = W2 = 100, Wb = -100, activation = exp(-(Wx)^2)
- exp(-(100*0 + 100*0 - 100*1)^2) ≈ 0
- exp(-(100*0 + 100*1 - 100*1)^2) = 1
- exp(-(100*1 + 100*0 - 100*1)^2) = 1
- exp(-(100*1 + 100*1 - 100*1)^2) ≈ 0
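The Gaussian case above is easy to check numerically. A minimal sketch in Python (the function name `gaussian_neuron` is mine, not from the answer):

```python
import math

def gaussian_neuron(x1, x2, w1=100, w2=100, wb=-100):
    """Single neuron with a Gaussian activation: exp(-(w1*x1 + w2*x2 + wb)^2)."""
    z = w1 * x1 + w2 * x2 + wb
    return math.exp(-z ** 2)

# Evaluate on all four XOR inputs; outputs are ~0, 1, 1, ~0.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(gaussian_neuron(x1, x2)))
```

The weighted sum is 0 exactly when one input is on, so the Gaussian peaks at 1 there and underflows to ~0 for (0,0) and (1,1).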
Or with the abs activation: W1 = -1, W2 = 1, Wb = 0 (yes, you can solve it even without a bias)
- abs(-1*0 + 1*0) = 0
- abs(-1*0 + 1*1) = 1
- abs(-1*1 + 1*0) = 1
- abs(-1*1 + 1*1) = 0
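The abs variant is even simpler, since it needs no bias and produces exact outputs. A quick check (the function name `abs_neuron` is my own):

```python
def abs_neuron(x1, x2, w1=-1, w2=1):
    """Single neuron with |.| activation and no bias term."""
    return abs(w1 * x1 + w2 * x2)

# abs(x2 - x1) is exactly XOR on {0, 1} inputs.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, abs_neuron(x1, x2))
```

With these weights the neuron computes abs(x2 - x1), which equals XOR exactly on binary inputs.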
Or with sine: W1 = W2 = -PI/2, Wb = -PI
- sin(-PI/2*0 - PI/2*0 - PI*1) = 0
- sin(-PI/2*0 - PI/2*1 - PI*1) = 1
- sin(-PI/2*1 - PI/2*0 - PI*1) = 1
- sin(-PI/2*1 - PI/2*1 - PI*1) = 0
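The sine version can be verified the same way; in floating point the outputs land within machine epsilon of 0 and 1, so they are rounded here (the function name `sine_neuron` is mine):

```python
import math

def sine_neuron(x1, x2, w1=-math.pi / 2, w2=-math.pi / 2, wb=-math.pi):
    """Single neuron with a sine activation: sin(w1*x1 + w2*x2 + wb)."""
    return math.sin(w1 * x1 + w2 * x2 + wb)

# The pre-activations are -pi, -3pi/2, -3pi/2, -2pi, whose sines
# are 0, 1, 1, 0 (up to floating-point round-off).
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, round(sine_neuron(x1, x2)))
```

Because sine is periodic, this single linear boundary in weight space gets folded so that the two "on" corners of the square land on the peak of the wave.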