PyTorch how to compute second order Jacobian?


Question

I have a neural network that's computing a vector quantity u. I'd like to compute the first- and second-order Jacobians with respect to the input x, a single element.

Would anybody know how to do that in PyTorch? Below is the code snippet from my project:

import torch
import torch.nn as nn

class PINN(torch.nn.Module):
    
    def __init__(self, layers:list):
        super(PINN, self).__init__()
        self.linears = nn.ModuleList([])
        for i, dim in enumerate(layers[:-2]):
            self.linears.append(nn.Linear(dim, layers[i+1]))
            self.linears.append(nn.ReLU())
        self.linears.append(nn.Linear(layers[-2], layers[-1]))
        
    def forward(self, x):
        for layer in self.linears:
            x = layer(x)
        return x

I then instantiate my network:

n_in = 1
units = 50
q = 500

pinn = PINN([n_in, units, units, units, q+1])
pinn

which returns

PINN(
  (linears): ModuleList(
    (0): Linear(in_features=1, out_features=50, bias=True)
    (1): ReLU()
    (2): Linear(in_features=50, out_features=50, bias=True)
    (3): ReLU()
    (4): Linear(in_features=50, out_features=50, bias=True)
    (5): ReLU()
    (6): Linear(in_features=50, out_features=501, bias=True)
  )
)

Then I compute both the first- and second-order Jacobians:

x = torch.randn(1, requires_grad=False)

u_x = torch.autograd.functional.jacobian(pinn, x, create_graph=True)
print("First Order Jacobian du/dx of shape {}, and features\n{}".format(u_x.shape, u_x))

u_xx = torch.autograd.functional.jacobian(lambda _: u_x, x)
print("Second Order Jacobian du_x/dx of shape {}, and features\n{}".format(u_xx.shape, u_xx))

which returns

First Order Jacobian du/dx of shape torch.Size([501, 1]), and features
tensor([[-0.0310],
        [ 0.0139],
        [-0.0081],
        [-0.0248],
        [-0.0033],
        [ 0.0013],
        [ 0.0040],
        [ 0.0273],
        ...
        [-0.0197]], grad_fn=<ViewBackward>)

Second Order Jacobian du_x/dx of shape torch.Size([501, 1, 1]), and features
tensor([[[0.]],

        [[0.]],

        [[0.]],

        [[0.]],

        ...

        [[0.]]])

Shouldn't u_xx be a None vector if it didn't depend on x?

Thanks in advance

Answer

So as @jodag mentioned in his comment, ReLU is either zero or linear, so its gradient is constant (except at 0, a rare event), and its second-order derivative is therefore zero everywhere it is defined. I changed the activation function to Tanh, which finally allows me to compute the Jacobian twice.
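To see this numerically, here is a minimal sketch (mine, not from the original post) that nests torch.autograd.functional.jacobian on the activation functions alone: the second derivative of ReLU comes out as all zeros, while Tanh's does not.

import torch
from torch.autograd.functional import jacobian

x = torch.randn(1)

# ReLU: the inner Jacobian is piecewise constant (0 or 1),
# so differentiating it a second time yields zeros.
relu_xx = jacobian(lambda t: jacobian(torch.relu, t, create_graph=True), x)
print(relu_xx)  # tensor([[[0.]]])

# Tanh: the second derivative is -2*tanh(x)*(1 - tanh(x)**2), generally non-zero.
tanh_xx = jacobian(lambda t: jacobian(torch.tanh, t, create_graph=True), x)
print(tanh_xx)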

The final code is:

import torch
import torch.nn as nn

class PINN(torch.nn.Module):
    
    def __init__(self, layers:list):
        super(PINN, self).__init__()
        self.linears = nn.ModuleList([])
        for i, dim in enumerate(layers[:-2]):
            self.linears.append(nn.Linear(dim, layers[i+1]))
            self.linears.append(nn.Tanh())  # Tanh is smooth, so its second derivative is non-trivial
        self.linears.append(nn.Linear(layers[-2], layers[-1]))
        
    def forward(self, x):
        for layer in self.linears:
            x = layer(x)
        return x
        
    def compute_u_x(self, x):
        # First-order Jacobian du/dx; create_graph=True keeps it differentiable
        self.u_x = torch.autograd.functional.jacobian(self, x, create_graph=True)
        self.u_x = torch.squeeze(self.u_x)
        return self.u_x
    
    def compute_u_xx(self, x):
        # Second-order Jacobian d(du/dx)/dx, obtained by differentiating the first-order one
        self.u_xx = torch.autograd.functional.jacobian(self.compute_u_x, x)
        self.u_xx = torch.squeeze(self.u_xx)
        return self.u_xx

Then calling compute_u_xx(x) on an instance of PINN with x.requires_grad set to True gets me there. How to get rid of the superfluous dimensions introduced by torch.autograd.functional.jacobian remains to be understood, though...
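For completeness, a short usage sketch (the variable names here are illustrative, not from the post); after the squeeze, both Jacobians come back as flat vectors of length 501:

pinn = PINN([1, 50, 50, 50, 501])
x = torch.randn(1, requires_grad=True)

u_x = pinn.compute_u_x(x)    # first-order Jacobian, shape torch.Size([501])
u_xx = pinn.compute_u_xx(x)  # second-order Jacobian, shape torch.Size([501])
print(u_x.shape, u_xx.shape)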
