Neural Network Output: Scaling the output range


Problem Description

The output layer of my neural network (3-layered) uses sigmoid as its activation, which outputs only in the range [0, 1]. However, if I want to train it for outputs beyond [0, 1], say in the thousands, what should I do?

For example, if I want to train:

Input ----> Output

0 0 ------> 0
0 1 ------> 1000
1000 1 ----> 1
1 1 -------> 0

My program works for AND, OR, XOR, etc., since the inputs and outputs are all binary.

There was a suggestion to use:

Activation:

y = lambda * (abs(x) * 1/(1 + exp(-1 * x)))

Activation derivative:

lambda * (abs(y) * y * (1 - y))
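A minimal transcription of that suggestion is sketched below. Note the multiplication signs are an assumption, since the post's formatting stripped them, and the value of lambda is not specified in the question, so the constant here is purely illustrative:

```python
import math

LAMBDA = 1000.0  # illustrative value; the suggestion does not fix lambda

def activation(x):
    # Reconstructed suggestion: y = lambda * (abs(x) * 1/(1 + exp(-1 * x)))
    return LAMBDA * abs(x) * (1.0 / (1.0 + math.exp(-x)))

def activation_derivative(y):
    # Reconstructed suggestion: lambda * (abs(y) * y * (1 - y)),
    # expressed in terms of the activation's output y
    return LAMBDA * abs(y) * y * (1.0 - y)
```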

This did not converge for the mentioned training pattern (if I have not done anything wrong). Are there any suggestions?

Answer


For classification problems, it is customary to use a sigmoid/logistic activation function in the output layer to get proper probability values in the range [0,1]; coupled with 1-of-N encoding for multi-class classification, each node output would represent the probability of the instance belonging to each class value.


On the other hand, if you have a regression problem, there is no need to apply additional functions on the output, and you can just take the raw linear combination output. The network will automatically learn the weights to give whatever output values you have (even in the thousands).
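To illustrate the point, here is a minimal sketch, not the asker's exact network: a fixed random sigmoid hidden layer whose linear output weights are solved in closed form by least squares. The hidden size and random seed are arbitrary choices; the point is that a raw linear output can reach targets in the thousands with no squashing function.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR-style pattern with targets in the thousands, as in the question.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1000., 1000., 0.])

# Fixed random sigmoid hidden layer; only the output weights are fitted.
H = 8
W = rng.normal(size=(2, H))
b = rng.normal(size=H)
hidden = 1.0 / (1.0 + np.exp(-(X @ W + b)))

# Linear output layer: a raw linear combination, solved by least squares.
Hmat = np.hstack([hidden, np.ones((len(X), 1))])  # append a bias column
w_out, *_ = np.linalg.lstsq(Hmat, t, rcond=None)
pred = Hmat @ w_out   # fits the pattern, including the 1000-valued targets
```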


You should also be careful to scale the input features (for example, by normalizing all features to the range [-1, 1]).
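A minimal min-max scaling sketch, using the feature values from the question's pattern (the mapping to [-1, 1] is a standard choice, not prescribed by the answer):

```python
import numpy as np

# Per-feature min-max scaling to [-1, 1]. The rows are the question's
# input patterns; note feature 0 spans 0..1000 before scaling.
X = np.array([[0., 0.], [0., 1.], [1000., 1.], [1., 1.]])
lo, hi = X.min(axis=0), X.max(axis=0)
X_scaled = -1.0 + 2.0 * (X - lo) / (hi - lo)  # assumes hi > lo per feature
```

A constant feature (hi equal to lo) would need a guard against division by zero before this formula is applied.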

