Pytorch ValueError: optimizer got an empty parameter list


Problem Description

When trying to create a neural network and optimize it using Pytorch, I am getting

ValueError: optimizer got an empty parameter list

Here is the code.

import torch.nn as nn
import torch.nn.functional as F
from os.path import dirname
from os import getcwd
from os.path import realpath
from sys import argv

class NetActor(nn.Module):

    def __init__(self, args, state_vector_size, action_vector_size, hidden_layer_size_list):
        super(NetActor, self).__init__()
        self.args = args

        self.state_vector_size = state_vector_size
        self.action_vector_size = action_vector_size
        self.layer_sizes = hidden_layer_size_list
        self.layer_sizes.append(action_vector_size)

        self.nn_layers = []
        self._create_net()

    def _create_net(self):
        prev_layer_size = self.state_vector_size
        for next_layer_size in self.layer_sizes:
            next_layer = nn.Linear(prev_layer_size, next_layer_size)
            prev_layer_size = next_layer_size
            self.nn_layers.append(next_layer)

    def forward(self, torch_state):
        activations = torch_state
        for i,layer in enumerate(self.nn_layers):
            if i != len(self.nn_layers)-1:
                activations = F.relu(layer(activations))
            else:
                activations = layer(activations)

        probs = F.softmax(activations, dim=-1)
        return probs

Then the call

        self.actor_nn = NetActor(self.args, 4, 2, [128])
        self.actor_optimizer = optim.Adam(self.actor_nn.parameters(), lr=args.learning_rate)

gives the very helpful error

ValueError: optimizer got an empty parameter list

I find it hard to understand what exactly in the network's definition makes the network have parameters.

I am following and expanding the example I found in Pytorch's tutorial code.

I can't really tell the difference between my code and theirs that makes mine think it has no parameters to optimize.

How can I make my network have parameters like the linked example?

Recommended Answer

Your NetActor does not directly store any nn.Parameter. Moreover, all the layers it eventually uses in forward are stored in a simple Python list, self.nn_layers.
If you want self.actor_nn.parameters() to know that the items stored in the list self.nn_layers may contain trainable parameters, you should work with containers.
Specifically, making self.nn_layers an nn.ModuleList instead of a simple list should solve your problem:

self.nn_layers = nn.ModuleList()
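
For reference, here is a minimal sketch of the question's class with only that change applied; the args value and learning rate used at the bottom are placeholders for illustration, not values from the original post:

import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim

class NetActor(nn.Module):

    def __init__(self, args, state_vector_size, action_vector_size, hidden_layer_size_list):
        super(NetActor, self).__init__()
        self.args = args

        self.state_vector_size = state_vector_size
        self.action_vector_size = action_vector_size
        self.layer_sizes = hidden_layer_size_list
        self.layer_sizes.append(action_vector_size)

        # nn.ModuleList registers each appended layer as a submodule,
        # so its weights show up in self.parameters()
        self.nn_layers = nn.ModuleList()
        self._create_net()

    def _create_net(self):
        prev_layer_size = self.state_vector_size
        for next_layer_size in self.layer_sizes:
            self.nn_layers.append(nn.Linear(prev_layer_size, next_layer_size))
            prev_layer_size = next_layer_size

    def forward(self, torch_state):
        activations = torch_state
        for i, layer in enumerate(self.nn_layers):
            if i != len(self.nn_layers) - 1:
                activations = F.relu(layer(activations))
            else:
                activations = layer(activations)
        return F.softmax(activations, dim=-1)

actor_nn = NetActor(None, 4, 2, [128])
print(sum(p.numel() for p in actor_nn.parameters()))  # non-zero parameter count now
actor_optimizer = optim.Adam(actor_nn.parameters(), lr=1e-3)  # no longer raises ValueError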
