Using automatic differentiation libraries to compute partial derivatives of an arbitrary tensor


Problem description


(Note: this is not a question about back-propagation.) I am trying to solve, on a GPU, a non-linear PDE using PyTorch tensors in place of Numpy arrays. I want to calculate the partial derivatives of an arbitrary tensor, akin to the action of the central finite-difference numpy.gradient function. I have other ways around this problem, but since I am already using PyTorch, I'm wondering if it is possible to use the autograd module (or, in general, any other autodifferentiation module) to perform this action.


I have created a tensor-compatible version of the numpy.gradient function, which runs a lot faster. But perhaps there is a more elegant way of doing this. I can't find any other sources that address this question, either to show that it's possible or impossible; perhaps this reflects my ignorance of autodifferentiation algorithms.
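For reference, a tensor-compatible central-difference gradient along the last dimension might be sketched as follows. This is not the asker's actual implementation; the name `central_gradient` is made up, and a uniform grid spacing `dx` is assumed:

```python
import numpy as np
import torch

def central_gradient(u, dx=1.0):
    """Central differences in the interior, first-order one-sided
    differences at the boundaries -- mirroring numpy.gradient
    (edge_order=1) along the last dimension of a tensor."""
    g = torch.empty_like(u)
    g[..., 1:-1] = (u[..., 2:] - u[..., :-2]) / (2 * dx)
    g[..., 0] = (u[..., 1] - u[..., 0]) / dx
    g[..., -1] = (u[..., -1] - u[..., -2]) / dx
    return g

# Agrees with numpy.gradient on a 1-D grid:
x = torch.linspace(0.0, 1.0, 11)
g = central_gradient(x**2, dx=0.1)
```

Because everything is expressed as tensor slicing, the same function runs unchanged on GPU tensors.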

Recommended answer


I've had this same question myself: when numerically solving PDEs, we need access to spatial gradients (which the numpy.gradient function can give us) all the time - could it be possible to use automatic differentiation to compute the gradients, instead of using finite differences or some flavor thereof?

"I'm wondering if it is possible to use the autograd module (or, in general, any other autodifferentiation module) to perform this action."


The answer is no: as soon as you discretize your problem in space or time, then time and space become discrete variables with a grid-like structure, and are not explicit variables which you feed into some function to compute the solution to the PDE.


For example, if I wanted to compute the velocity field of some fluid flow u(x,t), I would discretize in space and time, and I would have u[:,:] where the indices represent positions in space and time.


Automatic differentiation can compute the derivative of a function u(x,t). So why can't it compute the spatial or time derivative here? Because you've discretized your problem. This means you don't have a function u for arbitrary x, but rather values of u at some grid points. You can't differentiate automatically with respect to the spacing of the grid points.
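The contrast is easy to see with a closed-form field: when u is an explicit function of x and t rather than an array of samples, autograd does recover the derivatives exactly. A minimal sketch, using the made-up field u(x,t) = sin(x)·exp(-t):

```python
import math
import torch

# u(x, t) = sin(x) * exp(-t): an explicit function of x and t,
# so autograd can differentiate it with respect to both.
x = torch.tensor(1.0, requires_grad=True)
t = torch.tensor(0.5, requires_grad=True)
u = torch.sin(x) * torch.exp(-t)

du_dx, du_dt = torch.autograd.grad(u, (x, t))
# Analytically: du/dx = cos(x)*exp(-t), du/dt = -sin(x)*exp(-t)
```

Once u exists only as samples on a grid, there is no computational graph connecting those samples to x and t, and this approach no longer applies.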


As far as I can tell, the tensor-compatible function you've written is probably your best bet. Similar questions have been asked on the PyTorch forums. Or you could do something like

dx = x[:,:,1:]-x[:,:,:-1]

if you're not worried about the endpoints.
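Spelled out, that forward-difference snippet loses one sample along the differenced axis (the endpoint caveat above); dividing by an assumed uniform grid spacing h turns it into a derivative estimate:

```python
import torch

u = torch.arange(6.0).reshape(1, 2, 3) ** 2   # sample field, shape (1, 2, 3)
h = 1.0                                        # assumed uniform grid spacing
du = (u[:, :, 1:] - u[:, :, :-1]) / h          # shape (1, 2, 2): endpoint dropped
```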
