Are there any computational efficiency differences between nn.functional() vs nn.Sequential() in PyTorch?


Question

The following is a feed-forward network using the nn.functional module in PyTorch:

import torch.nn as nn
import torch.nn.functional as F

class newNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64,10)

    def forward(self,x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        # softmax needs an explicit dim; omitting it is deprecated in PyTorch
        x = F.softmax(self.fc3(x), dim=1)
        return x

model = newNetwork()
model

The following builds essentially the same feed-forward network using the nn.Sequential module. What is the difference between the two, and when would I use one instead of the other?

input_size = 784
hidden_sizes = [128, 64]
output_size = 10

Build the feed-forward network:

model = nn.Sequential(nn.Linear(input_size, hidden_sizes[0]),
                      nn.ReLU(),
                      nn.Linear(hidden_sizes[0], hidden_sizes[1]),
                      nn.ReLU(),
                      nn.Linear(hidden_sizes[1], output_size),
                      nn.Softmax(dim=1))
print(model)

Answer

There is no difference between the two. The latter is arguably more concise and easier to write, and the reason for having object (i.e. nn.Module) versions of pure, stateless functions such as ReLU and Sigmoid is to allow their use in constructs like nn.Sequential.
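To see the equivalence concretely, here is a minimal sketch (class and variable names are my own, not from the question): if the nn.Sequential model's layers are loaded with the same weights as the functional-style model, both produce identical outputs, since F.relu/F.softmax and nn.ReLU/nn.Softmax compute the same functions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FunctionalNet(nn.Module):
    """Feed-forward net using stateless calls from torch.nn.functional."""
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 64)
        self.fc3 = nn.Linear(64, 10)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return F.softmax(self.fc3(x), dim=1)

torch.manual_seed(0)
functional_model = FunctionalNet()

sequential_model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(),
                                 nn.Linear(128, 64), nn.ReLU(),
                                 nn.Linear(64, 10), nn.Softmax(dim=1))

# Copy the Linear layers' weights so the two models are parameter-identical.
sequential_model[0].load_state_dict(functional_model.fc1.state_dict())
sequential_model[2].load_state_dict(functional_model.fc2.state_dict())
sequential_model[4].load_state_dict(functional_model.fc3.state_dict())

x = torch.randn(5, 784)
assert torch.allclose(functional_model(x), sequential_model(x))
print("outputs match")
```

The choice is therefore stylistic: nn.Sequential is convenient for simple layer stacks, while the functional style gives more flexibility in forward() (branching, weight sharing, multiple inputs).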
