How to implement my own ResNet with torch.nn.Sequential in Pytorch?


Question

I want to implement a ResNet network (or rather, residual blocks), but I really want it to be in the sequential network form.

What I mean by sequential network form is the following:

## mdl5, from cifar10 tutorial
from collections import OrderedDict

import torch.nn as nn

mdl5 = nn.Sequential(OrderedDict([
    ('pool1', nn.MaxPool2d(2, 2)),
    ('relu1', nn.ReLU()),
    ('conv1', nn.Conv2d(3, 6, 5)),
    ('pool2', nn.MaxPool2d(2, 2)),  # keys must be unique, or the OrderedDict silently drops layers
    ('relu2', nn.ReLU()),
    ('conv2', nn.Conv2d(6, 16, 5)),
    ('relu3', nn.ReLU()),
    ('Flatten', nn.Flatten()),
    ('fc1', nn.Linear(16 * 2 * 2, 120)),  # 16 channels x 2x2 spatial remain from a 32x32 CIFAR-10 input
    ('relu4', nn.ReLU()),
    ('fc2', nn.Linear(120, 84)),
    ('relu5', nn.ReLU()),
    ('fc3', nn.Linear(84, 10))
]))

but of course with the NN lego blocks being "ResNet" (residual) blocks.

I know the equation is something like y = F(x) + x, where x is the block's input, F is the stack of layers inside the block, and the + x is the skip connection,

but I am not sure how to do it in Pytorch AND Sequential. Sequential is key for me!

Cross-posted:

Answer

You can't do it solely using torch.nn.Sequential, as it requires operations to go, as the name suggests, sequentially, while yours are parallel: the skip connection runs alongside the main branch.

You could, in principle, construct your own block really easily like this:

import torch

class ResNet(torch.nn.Module):
    """Wraps any module with a residual (identity skip) connection."""
    def __init__(self, module):
        super().__init__()
        self.module = module

    def forward(self, inputs):
        # y = F(x) + x: add the block's input back onto its output
        return self.module(inputs) + inputs
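
As a quick sanity check, wrapping torch.nn.Identity should exactly double the input, since forward returns module(x) + x (a minimal sketch, assuming the ResNet class above is already defined):

block = ResNet(torch.nn.Identity())
x = torch.randn(2, 32, 8, 8)
assert torch.allclose(block(x), 2 * x)  # module(x) + x == x + x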

You can then use it like this:

model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 32, kernel_size=7),
    # 32 filters in and out, no max pooling so the shapes can be added
    ResNet(
        torch.nn.Sequential(
            torch.nn.Conv2d(32, 32, kernel_size=3, padding=1),  # padding=1 keeps H and W, so the skip addition works
            torch.nn.ReLU(),
            torch.nn.BatchNorm2d(32),
            torch.nn.Conv2d(32, 32, kernel_size=3, padding=1),
            torch.nn.ReLU(),
            torch.nn.BatchNorm2d(32),
        )
    ),
    # Another ResNet block, you could make more of them
    # Downsampling using maxpool and others could be done in between etc. etc.
    ResNet(
        torch.nn.Sequential(
            torch.nn.Conv2d(32, 32, kernel_size=3, padding=1),
            torch.nn.ReLU(),
            torch.nn.BatchNorm2d(32),
            torch.nn.Conv2d(32, 32, kernel_size=3, padding=1),
            torch.nn.ReLU(),
            torch.nn.BatchNorm2d(32),
        )
    ),
    # Pool all 32 filter maps down to 1x1
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),  # (N, 32, 1, 1) -> (N, 32); `torch.squeeze` would also work
    # 32 features in, 10 classes out
    torch.nn.Linear(32, 10),
)
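
A minimal forward pass to confirm the shapes line up (assuming CIFAR-10-sized 3x32x32 inputs):

x = torch.randn(4, 3, 32, 32)  # batch of four CIFAR-10-sized images
logits = model(x)
print(logits.shape)            # torch.Size([4, 10])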

A fact that is usually overlooked (without real consequences when it comes to shallower networks) is that the skip connection should be left without any nonlinearities like ReLU or convolutional layers, and that's what you can see above (source: Identity Mappings in Deep Residual Networks).
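
If you want to follow that paper more closely, a pre-activation variant moves all BatchNorm/ReLU onto the residual branch so the skip path stays a pure identity. A minimal sketch (the PreActResidual name is just illustrative):

class PreActResidual(torch.nn.Module):
    def __init__(self, channels):
        super().__init__()
        # all normalization and nonlinearity live on the residual branch
        self.branch = torch.nn.Sequential(
            torch.nn.BatchNorm2d(channels),
            torch.nn.ReLU(),
            torch.nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            torch.nn.BatchNorm2d(channels),
            torch.nn.ReLU(),
            torch.nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return x + self.branch(x)  # the skip connection stays a pure identity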
