Pytorch custom activation functions?


Problem description

I'm having issues with implementing custom activation functions in Pytorch, such as Swish. How should I go about implementing and using custom activation functions in Pytorch?

Solution

There are four possibilities depending on what you are looking for. You will need to ask yourself two questions:

Q1) Will your activation function have learnable parameters?

如果,则没有选择将激活函数创建为 nn.Module 类,因为您需要存储这些权重.

If yes, you have no choice to create your activation function as an nn.Module class because you need to store those weights.

如果,则可以根据自己的方便随意创建普通函数或类.

If no, you are free to simply create a normal function, or a class, depending on what is convenient for you.

Q2) Can your activation function be expressed as a combination of existing PyTorch functions?

如果,则可以简单地将其编写为现有PyTorch函数的组合,而无需创建定义梯度的 backward 函数.

If yes, you can simply write it as a combination of existing PyTorch function and won't need to create a backward function which defines the gradient.

如果,则需要手动编写渐变.

If no you will need to write the gradient by hand.

Example 1: Swish function

The swish function f(x) = x * sigmoid(x) does not have any learned weights and can be written entirely with existing PyTorch functions, thus you can simply define it as a function:

import torch

def swish(x):
    # Swish: f(x) = x * sigmoid(x)
    return x * torch.sigmoid(x)

and then simply use it as you would torch.relu or any other activation function.
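For instance, a minimal sketch of dropping swish into a small network's forward pass (the network and its layer sizes here are arbitrary, purely for illustration):

```python
import torch
import torch.nn as nn

def swish(x):
    # Swish: f(x) = x * sigmoid(x)
    return x * torch.sigmoid(x)

class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 1)

    def forward(self, x):
        # swish goes exactly where torch.relu would
        return self.fc2(swish(self.fc1(x)))

net = SmallNet()
out = net(torch.randn(4, 10))
print(out.shape)  # torch.Size([4, 1])
```

Because swish is built from differentiable PyTorch ops, autograd handles its gradient automatically.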

Example 2: Swish with a learned slope

In this case you have one learned parameter, the slope, thus you need to make a class of it.

import torch
import torch.nn as nn

class LearnedSwish(nn.Module):
    def __init__(self, slope=1):
        super().__init__()
        # wrap the initial value in nn.Parameter so the slope is registered
        # as a learnable weight; writing slope * nn.Parameter(...) instead
        # would produce a plain tensor that never gets trained
        self.slope = nn.Parameter(torch.ones(1) * slope)

    def forward(self, x):
        return self.slope * x * torch.sigmoid(x)
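To confirm the slope really is trainable, you can inspect .named_parameters() and run one optimizer step. A self-contained sketch (the class is repeated so the snippet runs on its own, with the initial value wrapped in nn.Parameter so it is registered; the SGD learning rate and toy loss are arbitrary):

```python
import torch
import torch.nn as nn

class LearnedSwish(nn.Module):
    def __init__(self, slope=1):
        super().__init__()
        # nn.Parameter registers the slope as a learnable weight
        self.slope = nn.Parameter(torch.ones(1) * slope)

    def forward(self, x):
        return self.slope * x * torch.sigmoid(x)

act = LearnedSwish(slope=2.0)
print([name for name, _ in act.named_parameters()])  # ['slope']

# the slope receives a gradient and is updated like any other weight
opt = torch.optim.SGD(act.parameters(), lr=0.1)
loss = act(torch.randn(8)).pow(2).mean()
loss.backward()
opt.step()
```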

Example 3: Writing your own backward

If you have something for which you need to create your own gradient function, you can look at this example: Pytorch: define custom function
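As an illustration of that pattern, here is a sketch of swish written as a torch.autograd.Function with a hand-written backward. (This is unnecessary for swish itself, since autograd already handles it, but it shows the shape of the API; the final comparison checks the manual gradient against autograd's.)

```python
import torch

class SwishFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        sig = torch.sigmoid(x)
        # save tensors needed by backward
        ctx.save_for_backward(x, sig)
        return x * sig

    @staticmethod
    def backward(ctx, grad_output):
        x, sig = ctx.saved_tensors
        # d/dx [x * sigmoid(x)] = sigmoid(x) * (1 + x * (1 - sigmoid(x)))
        return grad_output * sig * (1 + x * (1 - sig))

x = torch.randn(5, requires_grad=True)
SwishFunction.apply(x).sum().backward()

# sanity check against autograd's own derivative of x * sigmoid(x)
x2 = x.detach().clone().requires_grad_(True)
(x2 * torch.sigmoid(x2)).sum().backward()
print(torch.allclose(x.grad, x2.grad))  # True
```

torch.autograd.gradcheck is also useful here for comparing a hand-written backward against numerical gradients.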

