Computing jacobian matrix in Tensorflow

Question

I want to calculate the Jacobian matrix with Tensorflow.

What I have:

import tensorflow as tf

def compute_grads(fn, vars, data_num):
    grads = []
    # For every data point n and every variable v, compute the gradient of
    # the n-th per-example loss (a 1x1 slice of fn) with respect to v.
    for n in range(0, data_num):
        for v in vars:
            grads.append(tf.gradients(tf.slice(fn, [n, 0], [1, 1]), v)[0])
    # Stack all per-example gradients and flatten them to one row per example.
    return tf.reshape(tf.stack(grads), shape=[data_num, -1])

fn is a loss function, vars are all the trainable variables, and data_num is the number of data points.

But if we increase the number of data points, it takes a tremendous amount of time to run the function compute_grads. Any ideas?
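
For context, here is a minimal sketch of how compute_grads might be invoked; the placeholder names, shapes, and the squared-error loss below are illustrative assumptions, not from the original question:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF1-style graph mode, as implied by tf.gradients

data_num = 4
x = tf.compat.v1.placeholder(tf.float32, [data_num, 3])  # hypothetical inputs
y = tf.compat.v1.placeholder(tf.float32, [data_num, 1])  # hypothetical targets
w = tf.compat.v1.get_variable("w", [3, 1])

# Per-example squared error with shape (data_num, 1), so that
# tf.slice(fn, [n, 0], [1, 1]) selects the scalar loss of example n.
fn = tf.square(tf.matmul(x, w) - y)

grads = compute_grads(fn, tf.compat.v1.trainable_variables(), data_num)

Note that each loop iteration adds a separate tf.gradients op to the graph, so graph construction alone scales with data_num * len(vars); this is the main reason the function slows down as the number of data points grows.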

Answer

Assuming that X and Y are Tensorflow tensors and that Y depends on X:

from tensorflow.python.ops.parallel_for.gradients import jacobian

J = jacobian(Y, X)

The result has the shape Y.shape + X.shape and provides the partial derivative of each element of Y with respect to each element of X.
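
Here is a minimal, self-contained sketch of this shape convention; the tensors X and Y, their shapes, and the use of graph mode are assumptions for illustration, not part of the original answer:

import tensorflow as tf
from tensorflow.python.ops.parallel_for.gradients import jacobian

tf.compat.v1.disable_eager_execution()  # the pfor-based jacobian builds graph ops

X = tf.compat.v1.get_variable("X", shape=[3, 2])  # hypothetical input variable
Y = tf.tanh(tf.matmul(X, tf.ones([2, 4])))        # shape (3, 4), depends on X

J = jacobian(Y, X)  # shape (3, 4, 3, 2) == Y.shape + X.shape

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    print(sess.run(J).shape)  # -> (3, 4, 3, 2)

In eager TF2, tf.GradientTape provides an equivalent jacobian method (tape.jacobian(Y, X)) that follows the same Y.shape + X.shape convention.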
