Theano HiddenLayer Activation Function


Question

Is there any way to use the Rectified Linear Unit (ReLU) as the activation function of the hidden layer instead of tanh() or sigmoid() in Theano? The implementation of the hidden layer is as follows, and as far as I have searched on the internet, ReLU is not implemented inside Theano.

class HiddenLayer(object):
  def __init__(self, rng, input, n_in, n_out, W=None, b=None, activation=T.tanh):
    pass
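
(For context, here is a minimal sketch of how such a layer typically applies the activation argument, modeled on the standard Theano MLP tutorial; the initialization details below are illustrative assumptions, not the asker's actual code.)

import numpy
import theano
import theano.tensor as T

class HiddenLayer(object):
    def __init__(self, rng, input, n_in, n_out, W=None, b=None, activation=T.tanh):
        # Create weight and bias shared variables if they were not supplied.
        if W is None:
            bound = numpy.sqrt(6.0 / (n_in + n_out))
            W_values = numpy.asarray(
                rng.uniform(low=-bound, high=bound, size=(n_in, n_out)),
                dtype=theano.config.floatX)
            W = theano.shared(value=W_values, name='W', borrow=True)
        if b is None:
            b_values = numpy.zeros((n_out,), dtype=theano.config.floatX)
            b = theano.shared(value=b_values, name='b', borrow=True)
        self.W, self.b = W, b

        # `activation` is just a callable applied to the linear output,
        # so any function returning a Theano expression can be passed in.
        lin_output = T.dot(input, self.W) + self.b
        self.output = lin_output if activation is None else activation(lin_output)
        self.params = [self.W, self.b]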

Answer

relu is easy to do in Theano:

switch(x<0, 0, x)

To use it in your case, write a Python function that implements relu and pass it to activation:

def relu(x):
    # element-wise max(0, x), expressed with a symbolic switch
    return theano.tensor.switch(x < 0, 0, x)

HiddenLayer(..., activation=relu)
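
A quick standalone check (a sketch independent of the HiddenLayer class) that the switch-based relu behaves as expected:

import numpy
import theano
import theano.tensor as T

def relu(x):
    return T.switch(x < 0, 0, x)

x = T.vector('x')
f = theano.function([x], relu(x))
print(f(numpy.asarray([-2.0, -0.5, 0.0, 3.0], dtype=theano.config.floatX)))
# -> [ 0.  0.  0.  3.]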

Some people use this implementation instead: x * (x > 0)

UPDATE: Newer Theano versions have theano.tensor.nnet.relu(x) available.
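
Where the built-in is available, it can be used the same way, both as a standalone op and as the activation callable; a brief sketch assuming a recent-enough Theano:

import numpy
import theano
import theano.tensor as T

x = T.vector('x')
f = theano.function([x], T.nnet.relu(x))
print(f(numpy.asarray([-1.0, 2.0], dtype=theano.config.floatX)))  # -> [ 0.  2.]

# It can also be passed directly to the layer:
# HiddenLayer(..., activation=T.nnet.relu)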
