Tensorflow: No gradients provided for any variable
Problem Description
I am new to TensorFlow and I am building a network, but failing to compute/apply the gradients for it. I get the error:
ValueError: No gradients provided for any variable: ((None, <tensorflow.python.ops.variables.Variable object at 0x1025436d0>), ... (None, <tensorflow.python.ops.variables.Variable object at 0x10800b590>))
I tried using a TensorBoard graph to see if there was something that made it impossible to trace the graph and get the gradients, but I could not see anything.
Here is part of the code:
sess = tf.Session()
X = tf.placeholder(type, [batch_size,feature_size])
W = tf.Variable(tf.random_normal([feature_size, elements_size * dictionary_size]), name="W")
target_probabilties = tf.placeholder(type, [batch_size * elements_size, dictionary_size])
lstm = tf.nn.rnn_cell.BasicLSTMCell(lstm_hidden_size)
stacked_lstm = tf.nn.rnn_cell.MultiRNNCell([lstm] * number_of_layers)
initial_state = state = stacked_lstm.zero_state(batch_size, type)
output, state = stacked_lstm(X, state)
pred = tf.matmul(output,W)
pred = tf.reshape(pred, (batch_size * elements_size, dictionary_size))
# instead of calculating this, I will calculate the difference between the target_W and the current W
cross_entropy = tf.nn.softmax_cross_entropy_with_logits(target_probabilties, pred)
cost = tf.reduce_mean(cross_entropy)
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)
sess.run(optimizer, feed_dict={X:my_input, target_probabilties:target_prob})
I will appreciate any help on figuring this out.
Recommended Answer
I always use tf.nn.softmax_cross_entropy_with_logits() with the logits as the first argument and the labels as the second. Can you try that? In your call, target_probabilties is passed first, so it is treated as the logits and pred as the labels; since this op only propagates gradients into its logits argument, no gradient ever reaches W or the LSTM weights, which is exactly what the "No gradients provided for any variable" error is reporting.
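To see why argument order matters, here is a small NumPy sketch (not the asker's code) that computes softmax cross-entropy by hand: the op is not symmetric in its two arguments, so swapping them silently produces a different quantity rather than an error.

```python
import numpy as np

def softmax_cross_entropy(logits, labels):
    """Cross-entropy of `labels` against softmax(logits), computed row-wise."""
    # Subtract the row max for numerical stability before exponentiating.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_softmax = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -(labels * log_softmax).sum(axis=1)

logits = np.array([[2.0, 0.5, -1.0]])
labels = np.array([[1.0, 0.0, 0.0]])  # one-hot target

correct = softmax_cross_entropy(logits, labels)
swapped = softmax_cross_entropy(labels, logits)  # argument order reversed

# The two results differ: cross-entropy treats its arguments asymmetrically.
print(correct, swapped)
```

In the TF 1.x API the unambiguous fix is keyword arguments, e.g. tf.nn.softmax_cross_entropy_with_logits(logits=pred, labels=target_probabilties); with the arguments in the right slots, the gradient flows from the loss into pred and hence into W and the LSTM variables.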