Pytorch custom activation functions?


Question

I'm having issues with implementing custom activation functions in Pytorch, such as Swish. How should I go about implementing and using custom activation functions in Pytorch?

Answer

There are four possibilities depending on what you are looking for. You will need to ask yourself two questions:

Q1) Will your activation function have learnable parameters?

如果,您别无选择,只能将激活函数创建为 nn.Module 类,因为您需要存储这些权重.

If yes, you have no choice but to create your activation function as an nn.Module class because you need to store those weights.

如果,您可以随意创建一个普通函数或类,具体取决于您方便的内容.

If no, you are free to simply create a normal function, or a class, depending on what is convenient for you.

Q2) Can your activation function be expressed as a combination of existing PyTorch functions?

如果,您可以简单地将其编写为现有 PyTorch 函数的组合,而无需创建定义渐变的 backward 函数.

If yes, you can simply write it as a combination of existing PyTorch function and won't need to create a backward function which defines the gradient.

如果,您将需要手写渐变.

If no you will need to write the gradient by hand.

Example 1: Swish function

The swish function f(x) = x * sigmoid(x) does not have any learned weights and can be written entirely with existing PyTorch functions, thus you can simply define it as a function:

import torch

def swish(x):
    return x * torch.sigmoid(x)

and then simply use it as you would torch.relu or any other activation function.
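For instance, here is a minimal sketch of calling swish inside a model's forward pass (the Net name and layer sizes are arbitrary, chosen only for illustration):

import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 1)

    def forward(self, x):
        x = swish(self.fc1(x))  # used exactly like torch.relu would be
        return self.fc2(x)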

Example 2: Swish with learned slope

In this case you have one learned parameter, the slope, so you need to make a class of it.

import torch
import torch.nn as nn

class LearnedSwish(nn.Module):
    def __init__(self, slope=1.0):
        super().__init__()
        # Wrap the initial value in nn.Parameter so the slope is
        # registered as a parameter and updated during training.
        # (Multiplying a Parameter by a scalar would instead produce
        # a plain tensor that the optimizer never sees.)
        self.slope = nn.Parameter(torch.ones(1) * slope)

    def forward(self, x):
        return self.slope * x * torch.sigmoid(x)
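Since LearnedSwish is an nn.Module, it can be dropped into a model like any other layer; its slope then shows up in model.parameters() and is trained along with the rest. A quick sketch (sizes and learning rate are arbitrary):

model = nn.Sequential(nn.Linear(10, 20), LearnedSwish(), nn.Linear(20, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)  # also updates the slope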

Example 3: The backward

If you have something for which you need to create your own gradient function, you can look at this example: Pytorch: define custom function
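To illustrate the pattern, here is a minimal sketch of a custom torch.autograd.Function, using swish again even though its gradient does not actually need to be hand-written:

import torch

class SwishFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Save the input; backward will need it to compute the gradient.
        ctx.save_for_backward(x)
        return x * torch.sigmoid(x)

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        sig = torch.sigmoid(x)
        # d/dx [x * sigmoid(x)] = sigmoid(x) * (1 + x * (1 - sigmoid(x)))
        return grad_output * sig * (1 + x * (1 - sig))

You would then use it via SwishFunction.apply(x) rather than calling the class directly.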
