Pytorch Autograd: what does runtime error "grad can be implicitly created only for scalar outputs" mean


Problem description

I am trying to understand PyTorch autograd in depth; I would like to observe the gradient of a simple tensor after it goes through a sigmoid function, as below:

import torch
from torch import autograd

D = torch.arange(-8, 8, 0.1, requires_grad=True)

with autograd.set_grad_enabled(True):
    S = D.sigmoid()
S.backward()  # fails: S is a 1-D tensor, not a scalar

My goal is to get D.grad, but even before reading it I get the runtime error:

RuntimeError: grad can be implicitly created only for scalar outputs

I saw another post with a similar question, but the answer there does not apply to my problem. Thanks.

Recommended answer

The error means you can only run .backward (with no arguments) on a unitary/scalar tensor, i.e. a tensor with a single element. Your S has one element per entry of D, so autograd cannot implicitly create the output gradient for you.

For example, you could do

T = torch.sum(S)
T.backward()

since T would be a scalar output.
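Putting the pieces together, here is a minimal end-to-end sketch of the fix: reduce S to a scalar with sum, call backward, and then check D.grad against the analytic sigmoid derivative σ(x)·(1 − σ(x)). (The allclose check is an illustrative addition, not part of the original answer.)

```python
import torch

D = torch.arange(-8, 8, 0.1, requires_grad=True)
S = D.sigmoid()

T = S.sum()   # scalar output, so backward() needs no arguments
T.backward()

# For y = sigmoid(x), dy/dx = sigmoid(x) * (1 - sigmoid(x)).
expected = (S * (1 - S)).detach()
print(torch.allclose(D.grad, expected))  # True
```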

I explained this in more detail in this answer.
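As an alternative sketch (my addition, not from the original answer): instead of reducing to a scalar, you can pass an explicit `gradient` argument to backward, which is the vector in the vector-Jacobian product. A tensor of ones yields the same per-element gradients as summing first.

```python
import torch

D = torch.arange(-8, 8, 0.1, requires_grad=True)
S = D.sigmoid()

# Supply the output gradient explicitly; ones_like(S) weights each
# element of S equally, matching the S.sum().backward() approach.
S.backward(gradient=torch.ones_like(S))

print(torch.allclose(D.grad, (S * (1 - S)).detach()))  # True
```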

