tf.GradientTape() returns None


Question

I'm trying to calculate gradients with tf.GradientTape. When I pass the loss and Model.variables (from a tf.keras.Model) as inputs, the result returned is an array of None. What am I doing wrong? The TensorFlow version I'm using is 1.9.

Model = CubeValModel(TrainingCurves)

LearningRate = 0.0005
TrainOpe = tf.train.AdamOptimizer(LearningRate, name="MainTrainingOpe")

for i in range (5):
    with tf.GradientTape() as t:
        Predictions = tf.nn.softmax(Model.FinalFC, name="SoftmaxPredictions")
        Cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits=Predictions, labels=TrainingLabels, name="CrossEntropy")
        Loss = tf.reduce_mean(Cross_entropy, name="Loss")
        print (Loss)
        print (Model.variables)
        Gradients = t.gradient(Loss, Model.variables)
        print(Gradients)

Output:

tf.Tensor(0.84878147, shape=(), dtype=float32)

[<tf.Variable 'LayerBlock1/Weights1:0' shape=(1, 3, 1, 3) dtype=float32, numpy=

[None, None, None, None, None, None, None, None, None]

Answer

I assume you're using TensorFlow eager execution, aren't you? If I'm not mistaken, under tf.GradientTape() you should call the method that computes your model, instead of reading one of its members (here, Model.FinalFC, which was computed outside the tape). Running the forward computation inside the tape is what allows t to figure out which gradients it needs to generate later. I hope this helps.
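A minimal sketch of the fix described above, using TF 2.x-style eager code. CubeValModel and TrainingCurves from the question are replaced with a hypothetical toy model and dummy data; the point is only that the forward pass happens inside the tape:

```python
import tensorflow as tf

# Hypothetical stand-in for CubeValModel: a single dense layer.
model = tf.keras.Sequential([tf.keras.layers.Dense(3)])

x = tf.random.normal([4, 5])          # dummy inputs (stand-in for TrainingCurves)
labels = tf.one_hot([0, 1, 2, 0], 3)  # dummy one-hot labels

with tf.GradientTape() as t:
    logits = model(x)  # forward pass INSIDE the tape, so its ops are recorded
    loss = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=labels))

grads = t.gradient(loss, model.trainable_variables)
# grads now contains one tensor per variable, none of them None
```

Note also that tf.nn.softmax_cross_entropy_with_logits expects raw logits; the code in the question applies tf.nn.softmax first and then feeds the result in as logits, which computes the wrong loss even once the gradient issue is fixed.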

