Rank of labels (received 1) should equal rank of logits minus 1 (received 4)
I have a tf.RandomShuffleQueue called data_queue, defined as follows:
self.data_queue = tf.RandomShuffleQueue(capacity=1024,
                                        min_after_dequeue=21,
                                        dtypes=[tf.float32, tf.int32],
                                        shapes=[[221, 221, 3], []],
                                        name="data_queue")
I am able to successfully enqueue data items into it.
The dequeue operation is defined as follows:
[self.batch_images, self.batch_labels] = self.data_queue.dequeue_up_to(self.batchsize)
In the above snippet, self.batchsize is a constant tensor.
The issue is as follows:
I want to push these directly into my graph. For clarity, the first layer of my graph is:
conv1 = tf.layers.conv2d(self.batch_images, filters=96, kernel_size=7,
                         strides=2,
                         activation=tf.nn.relu,
                         kernel_regularizer=tf.random_uniform_initializer,
                         name='conv1')
The last few lines referenced by the error are:
drop2 = tf.layers.dropout(fc2, name='drop2')
cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(
logits=drop2,
labels=self.batch_labels,
name="cross_entropy_per_example")
The error I receive is:
line 1709, in sparse_softmax_cross_entropy_with_logits
(labels_static_shape.ndims, logits.get_shape().ndims))
ValueError: Rank mismatch: Rank of labels (received 1) should equal rank of logits minus 1 (received 4).
When I checked the tf.rank of self.images and self.labels, I got the following:
Labels Tensor("Rank:0", shape=(), dtype=float32) Images Tensor("Rank_1:0", shape=(), dtype=int32)
What is the reason for this?
NOTE: I do not want to use tf.placeholder and feed_dict. I want to connect the self.data_queue directly to the graph.
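The mismatch in the traceback can be reproduced purely in terms of shapes: sparse_softmax_cross_entropy_with_logits expects labels of rank N-1 against logits of rank N, but the output of a conv/dropout stack is still a rank-4 feature map. A minimal numpy sketch of the ranks involved (the spatial and channel sizes here are illustrative assumptions, not the exact network dimensions):

```python
import numpy as np

batch = 21
# Output of a conv layer is rank 4: (batch, height, width, channels)
conv_out = np.zeros((batch, 14, 14, 96), dtype=np.float32)
# Labels dequeued from the queue are rank 1: (batch,)
labels = np.zeros((batch,), dtype=np.int32)

# sparse softmax cross-entropy wants labels.ndim == logits.ndim - 1,
# i.e. rank-2 logits (batch, num_classes) against rank-1 labels.
print(conv_out.ndim)  # 4 -> "rank of logits ... (received 4)"
print(labels.ndim)    # 1 -> "Rank of labels (received 1)"
```

Since 1 != 4 - 1, the loss op raises exactly the ValueError quoted above.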
I found the solution myself. Posting it here in case someone else needs help with this in the future.
The output of the convolutional layer has to be reshaped while keeping the batch size the same. This is important because otherwise the fully connected calculations make no sense.
Unfortunately, the documentation of tf.layers.dense here is quite vague on this, and the layer does not do the appropriate reshaping itself.
I did the reshaping using tf.contrib.layers.flatten, and it worked like a charm.
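In shape terms, the flatten step collapses everything except the batch dimension: a rank-4 feature map (batch, height, width, channels) becomes a rank-2 matrix (batch, height*width*channels), which a dense layer can then map to (batch, num_classes) logits. A numpy sketch of the equivalent reshape (the concrete sizes are assumptions for illustration):

```python
import numpy as np

# Pretend conv output: rank 4, batch of 21
conv_out = np.arange(21 * 14 * 14 * 96, dtype=np.float32).reshape(21, 14, 14, 96)

# Equivalent of tf.contrib.layers.flatten: keep the batch axis, merge the rest.
flat = conv_out.reshape(conv_out.shape[0], -1)

print(flat.shape)  # (21, 18816) -> rank 2, ready for a dense layer
```

After this, logits have rank 2 and the rank-1 labels satisfy the "rank of logits minus 1" requirement.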