Change the threshold value of the keras RELU activation function


Question


I am trying to change the threshold value of the activation function Relu while building my neural network.

So, the initial code was the one written below where the default value of the relu threshold is 0.

model = Sequential([
    Dense(n_inputs, input_shape=(n_inputs, ), activation = 'relu'),
    Dense(32, activation = 'relu'),
    Dense(2, activation='softmax')
])

However, Keras provides a function implementation of the same, which can be referred to in the Keras documentation (keras.activations.relu accepts a threshold argument).

So, I changed my code to the following to pass a custom function, only to get the error below.

from keras.activations import relu
model = Sequential([
    Dense(n_inputs, input_shape=(n_inputs, ), activation = relu(threshold = 2)), 
    Dense(32, activation = relu(threshold = 2)),
    Dense(2, activation='softmax')
])

Error: TypeError: relu() missing 1 required positional argument: 'x'

I understand the error is that I am not passing x to the relu function, but there is no way for me to supply it at that point. The syntax expects me to write model.add(layers.Activation(activations.relu)), but then I won't be able to change the threshold. This is where I need a workaround or solution.
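A minimal sketch of what the error means: keras.activations.relu is a plain function of the input tensor, so the threshold keyword can only take effect once a tensor is actually passed in.

import tensorflow as tf
from tensorflow.keras.activations import relu

x = tf.constant([1.0, 2.0, 3.0])

# relu(threshold=2)                   # TypeError: relu() missing 1 required positional argument: 'x'
print(relu(x, threshold=2).numpy())   # values at or below the threshold become 0 -> [0. 0. 3.]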

I then used the layer implementation of the ReLU function, which worked for me as written below. However, I want to find out whether there is a way to make the activation-function implementation work, because adding a separate layer is not always convenient and I want to make more modifications inside the Dense call.

Code which worked for me:

from keras.layers import ReLU
model = Sequential([
    Dense(n_inputs, input_shape=(n_inputs, )),
    ReLU(threshold=4), 
    Dense(32),
    ReLU(threshold=4),
    Dense(2, activation='softmax')
])

Solution

The error you're facing is reasonable. However, you can use the following trick on the relu function for your purpose: define an outer function that takes the necessary arguments, e.g. alpha, threshold, etc., and in its body define an inner function that computes the relu activation with those parameters; the outer function then returns that inner function.

# help(tf.keras.backend.relu)
import tensorflow as tf
from tensorflow.keras import backend as K

def relu_advanced(alpha=0.0, max_value=None, threshold=0):
    # The outer function captures the parameters; the inner closure is what
    # Keras will call later with the layer's pre-activation tensor.
    def relu_plus(x):
        return K.relu(x,
                      alpha=tf.cast(alpha, tf.float32),
                      max_value=max_value,
                      threshold=tf.cast(threshold, tf.float32))
    return relu_plus

Samples:

foo = tf.constant([-10, -5, 0.0, 5, 10], dtype=tf.float32)

tf.keras.activations.relu(foo).numpy()
# array([ 0.,  0.,  0.,  5., 10.], dtype=float32)

x = relu_advanced(threshold=1)
x(foo).numpy()
# array([-0., -0.,  0.,  5., 10.], dtype=float32)

For your case, simply use as follows:

model = Sequential([
    Dense(64, input_shape=(32, ), activation = relu_advanced(threshold=2)), 
    Dense(32, activation = relu_advanced(threshold=2)),
    Dense(2, activation='softmax')
])
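If you prefer not to define a named helper, the same partial application can be written inline with a lambda; this is a sketch along the lines of the answer above, not something from the original post.

import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

# The lambda partially applies the threshold, just like relu_advanced does;
# Keras only calls it later, when the layer has a tensor to pass in.
model = Sequential([
    Dense(64, input_shape=(32, ),
          activation=lambda x: tf.keras.activations.relu(x, threshold=2.0)),
    Dense(32, activation=lambda x: tf.keras.activations.relu(x, threshold=2.0)),
    Dense(2, activation='softmax')
])

A named wrapper such as relu_advanced is generally preferable if the model will be saved and reloaded, since lambdas do not serialize cleanly.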
