pytorch grad is None after .backward()


Problem description

I just installed torch-1.0.0 on Python 3.7.2 (macOS) and tried the tutorial, but the following code:

import torch
x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()
out.backward()
print(out.grad)

prints None, which is not what's expected.

What's wrong?

Recommended answer

This is the expected result.

.backward accumulates gradients only in the leaf nodes of the computation graph. out is not a leaf node, hence its grad is None.
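To see the difference between leaf and non-leaf tensors concretely, here is the tutorial's graph again: x is the leaf, so the gradient lands on x.grad, while out.grad stays None.

```python
import torch

# Reproduce the tutorial graph: x is a leaf tensor, out is not.
x = torch.ones(2, 2, requires_grad=True)
y = x + 2
z = y * y * 3
out = z.mean()
out.backward()

print(out.grad)  # None -- out is an intermediate (non-leaf) tensor
print(x.grad)    # tensor([[4.5, 4.5], [4.5, 4.5]]) -- gradient accumulated on the leaf
```

If you do want the gradient on a non-leaf tensor, calling out.retain_grad() before out.backward() tells autograd to keep it; out.grad is then tensor(1.) instead of None.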

autograd.backward does the same thing.

autograd.grad can be used to find the gradient of any tensor w.r.t. any tensor. So if you do autograd.grad(out, out), you get (tensor(1.),) as output, which is as expected.
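A minimal sketch of that autograd.grad call, extended with a second call against the leaf x (the retain_graph=True flag is needed so the graph survives for the second call; that detail is not in the answer above but is standard torch.autograd behavior):

```python
import torch

x = torch.ones(2, 2, requires_grad=True)
out = (3 * (x + 2) ** 2).mean()  # same graph as the tutorial

# The gradient of a tensor w.r.t. itself is 1.
g = torch.autograd.grad(out, out, retain_graph=True)
print(g)  # (tensor(1.),)

# The same function computes out's gradient w.r.t. the leaf x,
# without populating x.grad as .backward() would.
(gx,) = torch.autograd.grad(out, x)
print(gx)  # tensor([[4.5, 4.5], [4.5, 4.5]])
```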

