How to scale data in a neural network?


Problem description




Hi,

I'm learning about neural networks. How do I scale data in a multilayer back-propagation neural network? I've found this formula for the input and test values:

I = Imin + (Imax-Imin)*(X-Dmin)/(Dmax-Dmin)
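As a worked example (these constants are my own illustration, matching the multiplication-table data below): with Imin = -1, Imax = 1, Dmin = 1, and Dmax = 25, an input X = 6 maps to

I = -1 + (1 - (-1))*(6 - 1)/(25 - 1) = -1 + 10/24 = -7/12 ≈ -0.583

which lands inside the [-1, 1] range a sigmoid-style activation expects.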




Input values are real numbers like in this multiplication table

1 1 1  (1 x 1 =1)
1 2 2  (1 x 2 =2)
.
.
.
2 3 6  (2 x 3 =6)
.
.
5 5 25 (5 x 5=25)







I'd like to know how to un-scale the output data to get the real output answers.

Thank you


Thanks to this answer I've started my scaling phase, though I'm still searching for good normalization formulas.
One can find so many papers on neural networks, but very few on scaling and un-scaling your data.

I've started this coding from the article
Back-propagation Neural Net
where someone has implemented the multiplication table solved by the backpropagation algorithm.

I'm continuing my search.
I'm using:

(double)LO+(HI-LO)*((X - minX)/(maxX - minX));

to scale, where:
LO=-1
HI=1
minX=1
maxX=25  (last result on my multiplication table)
X=input to scale
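Since that scaling line is a straight min-max map, un-scaling is just the same formula solved for X (using the same LO, HI, minX, maxX):

X = minX + (maxX - minX)*((I - LO)/(HI - LO));

For instance, a network output of I = 1 with these constants recovers X = 1 + 24*(1 - (-1))/(1 - (-1)) = 25, the last table entry.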






I need to find out how to un-scale, and whether I'm using the right scaling method.

Well, thanks again

Recommended answer

Scale down the answers when training on the input set so that they lie within the range of your sigmoid/squashing function, then scale back up by the same factor when reading the output. This acts as a constant weight on the output neuron's output.
BPNs are poorly suited to this task because the result set is limited by the weights, but it is a good way to watch the network learn. I also trained one on a sine function and on logical operations.

