Keras Functional API and activations


Problem description


I'm having problems when trying to use activations with the Keras Functional API. My initial goal was to be able to choose between relu and leaky relu, so I came up with the following piece of code:

import keras
from keras import activations
from keras.layers import Conv2D

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)


# building the model

inputs = keras.Input(input_shape, dtype='float32')
x = Conv2D(filters, (3, 3), padding='same')(inputs)
x = activation(x, 'relu')


but something like this gives an error: AttributeError: 'Tensor' object has no attribute '_keras_history'. I found out that it may indicate that my inputs and outputs in Model are not connected.


Is keras.advanced_activations the only way to achieve functionality like this in the Functional API?

Here's the version of the activation function that worked:

    def activation(self, x):
        if self.activation_type == 'leaky_relu':
            act = lambda x: activations.relu(x, alpha=0.3)
        else:
            act = activations.get(self.activation_type)
        return layers.Activation(act)(x)
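For context, the method above can be dropped into a small model like this; a minimal runnable sketch, where the class name `Net` and the layer sizes are illustrative, not from the original:

```python
import numpy as np
from keras import activations, layers
from keras.models import Model

class Net:
    """Illustrative holder for the working activation method above."""
    def __init__(self, activation_type):
        self.activation_type = activation_type

    def activation(self, x):
        if self.activation_type == 'leaky_relu':
            act = lambda x: activations.relu(x, alpha=0.3)
        else:
            act = activations.get(self.activation_type)
        # Wrapping the callable in an Activation layer keeps the graph connected.
        return layers.Activation(act)(x)

net = Net('relu')
inputs = layers.Input((5,), dtype='float32')
x = layers.Dense(16)(inputs)
x = net.activation(x)
model = Model(inputs, x)

out = model.predict(np.zeros((2, 5), dtype='float32'), verbose=0)
```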

Answer


You want to add an activation to your model by means of an activation layer. Currently, you are adding an object that is not a Keras Layer, which is causing your error. (In Keras, layer names always start with a capital). Try something like this (minimal example):

from keras.layers import Input, Dense, Activation
from keras import activations

def activation(x, activation_type):
    if activation_type == 'leaky_relu':
        return activations.relu(x, alpha=0.3)
    else:
        return activations.get(activation_type)(x)


# building the model
inputs = Input((5,), dtype='float32')
x = Dense(128)(inputs)
# Wrap inside an Activation layer; the lambda argument is named t to
# avoid shadowing the outer tensor x
x = Activation(lambda t: activation(t, 'sigmoid'))(x)
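On the keras.advanced_activations question: leaky ReLU also ships as a ready-made layer, so for this particular case no custom wrapper is needed. A minimal sketch (the input and layer sizes are arbitrary; LeakyReLU's default negative slope is 0.3, matching the alpha=0.3 used above):

```python
from keras.models import Model
from keras.layers import Input, Dense, LeakyReLU

inputs = Input((5,), dtype='float32')
x = Dense(128)(inputs)
# LeakyReLU is itself a Layer, so it connects into the graph
# like Dense or Conv2D; default negative slope is 0.3.
x = LeakyReLU()(x)
model = Model(inputs, x)
```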
