Does it make sense to use an "activation function cocktail" for approximating an unknown function through a feed-forward neural network?


Question

I just started playing around with neural networks and, as I would expect, in order to train a neural network effectively there must be some relation between the function to approximate and activation function.

For instance, I had good results using sin(x) as an activation function when approximating cos(x), or two tanh(x) to approximate a Gaussian. Now, to approximate a function about which I know nothing, I am planning to use a cocktail of activation functions, for instance a hidden layer with some sin, some tanh, and a logistic function. In your opinion, does this make sense?
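A minimal sketch of the "cocktail" idea described above: one hidden layer where each unit cycles through a different activation (sin, tanh, logistic). The weights and biases here are illustrative placeholders, not trained values, and the function names are hypothetical.

```python
import math

# The "cocktail": one activation per hidden unit, cycling through the list.
ACTIVATIONS = [math.sin, math.tanh, lambda z: 1.0 / (1.0 + math.exp(-z))]

def hidden_layer(x, weights, biases):
    """Forward pass through one hidden layer with mixed activations."""
    return [ACTIVATIONS[i % len(ACTIVATIONS)](w * x + b)
            for i, (w, b) in enumerate(zip(weights, biases))]

def output(hidden, out_weights, out_bias):
    """Linear output unit, as is typical for regression tasks."""
    return sum(w * h for w, h in zip(out_weights, hidden)) + out_bias

# Example forward pass with placeholder parameters.
h = hidden_layer(0.5, weights=[1.0, -0.5, 2.0], biases=[0.0, 0.1, -0.2])
y = output(h, out_weights=[0.3, 0.3, 0.3], out_bias=0.0)
```

Nothing here says the mixture will train better than a homogeneous layer; it only shows that the forward pass is mechanically the same as with a single activation.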

Thank you,
Tunnuz

Answer

While it is true that different activation functions have different merits (mainly for either biological plausibility or a unique network design like radial basis function networks), in general you should be able to use any continuous squashing function and expect to be able to approximate most functions encountered in real-world training data.

The two most popular choices are the hyperbolic tangent and the logistic function, since they both have easily calculable derivatives and interesting behavior around the axis.
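The "easily calculable derivatives" point can be made concrete: both functions have derivatives expressible in terms of the function's own value, so the forward-pass output can be reused during backpropagation. A small sketch:

```python
import math

def logistic(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic_prime(z):
    s = logistic(z)
    return s * (1.0 - s)            # derivative reuses the function value

def tanh_prime(z):
    return 1.0 - math.tanh(z) ** 2  # likewise cheap given tanh(z)
```

The "interesting behavior around the axis" is also visible here: tanh is zero-centered (tanh(0) = 0) with maximal slope 1 at the origin, while the logistic function crosses the axis at 0.5 with maximal slope 0.25.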

If neither of those allows you to accurately approximate your function, my first response wouldn't be to change activation functions. Rather, you should first investigate your training set and network training parameters (learning rates, number of units in each pool, weight decay, momentum, etc.).
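For reference, the training parameters listed above typically interact in a single weight-update rule. A hypothetical plain-SGD step combining learning rate, momentum, and weight decay (the function name and default values are illustrative):

```python
def sgd_step(w, grad, velocity, lr=0.01, momentum=0.9, weight_decay=1e-4):
    """One SGD update for a single weight.

    Weight decay adds an L2 penalty gradient; momentum accumulates
    past updates in `velocity` to smooth the descent direction.
    """
    v = momentum * velocity - lr * (grad + weight_decay * w)
    return w + v, v

# Usage: carry the velocity across steps.
w, v = 1.0, 0.0
w, v = sgd_step(w, grad=0.5, velocity=v)
```

Tuning these values often matters more than the choice of squashing function, which is the answer's point.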

If you're still stuck, step back and make sure you're using the right architecture (feed-forward vs. simple recurrent vs. fully recurrent) and learning algorithm (back-propagation vs. back-propagation through time vs. contrastive Hebbian vs. evolutionary/global methods).

One side note: make sure you never use a linear activation function (except for output layers or crazy simple tasks), as these have very well-documented limitations, namely the need for linear separability.
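The reason for that limitation is that a stack of linear layers collapses algebraically into a single linear map, so depth buys nothing and non-linearly-separable problems (the classic example is XOR) stay out of reach. A toy one-dimensional illustration:

```python
# Two "linear layers" composed...
def layer1(x):
    return 2.0 * x + 1.0

def layer2(h):
    return -3.0 * h + 0.5

def composed(x):
    return layer2(layer1(x))

# ...are exactly equivalent to one linear layer a*x + b,
# with a = -3.0 * 2.0 and b = -3.0 * 1.0 + 0.5.
def collapsed(x):
    return -6.0 * x - 2.5
```

Inserting any non-linear squashing function between the layers breaks this collapse, which is what gives multi-layer networks their approximation power.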
