Partial Derivative using Autograd


Question

I have a function that takes in a multivariate argument x. Here x = [x1,x2,x3]. Let's say my function looks like: f(x,T) = np.dot(x,T) + np.exp(np.dot(x,T)), where T is a constant.

I am interested in finding df/dx1, df/dx2 and df/dx3 functions.

I have achieved some success using scipy diff, but I am a bit skeptical because it uses numerical differences. Yesterday, my colleague pointed me to Autograd (github). Since it seems to be a popular package, I am hoping someone here knows how to get partial differentiation using this package. My initial tests with this library indicate that the grad function only differentiates with respect to the first argument. I am not sure how to extend it to the other arguments. Any help would be greatly appreciated.

Thanks.

Answer

I found the following description of the grad function in the autograd source code:

def grad(fun, argnum=0):
    """Returns a function which computes the gradient of `fun` with
    respect to positional argument number `argnum`. The returned
    function takes the same arguments as `fun`, but returns the
    gradient instead. The function `fun` should be scalar-valued. The
    gradient has the same type as the argument."""

So

from autograd import grad
import autograd.numpy as np

def h(x, t):
    return np.dot(x, t) + np.exp(np.dot(x, t))

h_x = grad(h, 0)  # derivative with respect to x
h_t = grad(h, 1)  # derivative with respect to t

Also make sure to use the numpy library that comes with autograd

import autograd.numpy as np

instead of

import numpy as np

in order to make use of all numpy functions.
