Multi-layer neural network won't predict negative values
Question
I have implemented a multilayer perceptron to predict the sine of input vectors. Each vector consists of four values chosen at random from {-1, 0, 1}, plus a bias set to 1. The network should predict the sine of the sum of the vector's contents.
e.g. Input = <0, 1, -1, 0, 1>, Output = sin(0 + 1 + (-1) + 0 + 1)
The problem I am having is that the network never predicts a negative value, even though many of the vectors' sine values are negative. It predicts all positive or zero outputs perfectly. I presume there is a problem with updating the weights, which are updated after every epoch. Has anyone encountered this problem with NNs before? Any help at all would be great!
Note: the network has 5 inputs, 6 hidden units in 1 hidden layer, and 1 output. I am using a sigmoid function on the activations of the hidden and output layers, and have tried tonnes of learning rates (currently 0.1).
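For reference, the target function described above can be generated like this (a sketch; the original post includes no code, and the function name is my own):

```python
import math
import random

def make_example():
    # Four values chosen at random from {-1, 0, 1}, plus a bias fixed at 1.
    vec = [random.choice([-1, 0, 1]) for _ in range(4)] + [1]
    # The sum ranges over [-3, 5], so the sine target is often negative.
    target = math.sin(sum(vec))
    return vec, target
```

Note that roughly the case sum < 0 (e.g. <-1, -1, -1, -1, 1>) yields a strictly negative target, which the sigmoid output unit can never reach.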
It's been a long time since I looked into multilayer perceptrons, so take this with a grain of salt.
I'd rescale your problem domain to [0, 1] instead of [-1, 1]. If you take a look at a graph of the logistic function, you'll see it only generates values between 0 and 1. I would not expect it to produce negative results. I might be wrong, though.
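A quick check makes the point concrete: the standard logistic (sigmoid) output is strictly inside (0, 1), so a network whose output unit uses it can never emit a negative prediction, no matter what the weights are.

```python
import math

def sigmoid(x):
    # Standard logistic function: output is strictly between 0 and 1.
    return 1.0 / (1.0 + math.exp(-x))
```

Even for very negative inputs the output only approaches 0 from above; it never crosses into negative territory.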
EDIT:
You can actually extend the logistic function to your problem domain: use the generalized logistic curve, setting the A and K parameters to the boundaries of your domain.
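A minimal sketch of that idea, where (following the generalized logistic curve convention) A is the lower asymptote and K the upper one; with A = -1 and K = 1 the output covers the sine's full [-1, 1] range:

```python
import math

def generalized_logistic(x, A=-1.0, K=1.0):
    # Stretch the standard logistic from (0, 1) to (A, K).
    # With A = -1, K = 1 this matches the range of sin.
    return A + (K - A) / (1.0 + math.exp(-x))
```

If you use this as the output activation, remember to update the derivative used in the backprop weight update accordingly.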
Another option is the hyperbolic tangent, which ranges over (-1, +1) and has no constants to set up.
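Swapping tanh in for the output activation is usually the simpler fix; a sketch, including the derivative the backprop weight update would need:

```python
import math

def tanh_activation(x):
    # Range is (-1, 1), matching the range of sin.
    return math.tanh(x)

def tanh_derivative(x):
    # d/dx tanh(x) = 1 - tanh(x)^2; used in the backprop weight update.
    t = math.tanh(x)
    return 1.0 - t * t
```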