How to implement PReLU activation in Tensorflow?


Problem description

The Parametric Rectified Linear Unit (PReLU) is an interesting and widely used activation function. It seems that Tensorflow (reference link) does not provide PReLU. I know that higher-level libraries, such as Keras and TFLearn, have implementations of it.

I would like to know how to implement PReLU in Tensorflow.
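
For reference, PReLU is defined as f(x) = max(0, x) + α · min(0, x), where α is a learnable coefficient (typically one per channel). With α = 0 it reduces to plain ReLU, and with α fixed at a small constant it becomes Leaky ReLU.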

Recommended answer

The implementation of PReLU seems straightforward based on the PReLU implementations of the higher-level libraries (see: Keras, TFLearn and TensorLayer). My code is as follows:

import tensorflow as tf

def parametric_relu(_x):
    # One learnable alpha per channel (the last dimension of the input),
    # initialized to 0 so the activation starts out as a plain ReLU.
    alphas = tf.get_variable('alpha', _x.get_shape()[-1],
                             initializer=tf.constant_initializer(0.0),
                             dtype=tf.float32)
    pos = tf.nn.relu(_x)                    # max(0, x)
    neg = alphas * (_x - tf.abs(_x)) * 0.5  # alphas * min(0, x)

    return pos + neg
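
Since tf.get_variable creates a variable named 'alpha', wrapping each call in its own variable scope avoids name collisions when the activation is applied in more than one layer. A minimal usage sketch under that assumption (TF1-style graph API; the tensor shapes and the scope names prelu_1/prelu_2 are illustrative, not part of the original answer):

x = tf.placeholder(tf.float32, [None, 128])  # illustrative input
w1 = tf.get_variable('w1', [128, 64])
w2 = tf.get_variable('w2', [64, 10])

with tf.variable_scope('prelu_1'):           # each scope gets its own 'alpha'
    h = parametric_relu(tf.matmul(x, w1))
with tf.variable_scope('prelu_2'):
    logits = parametric_relu(tf.matmul(h, w2))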
