How to print the value of a tensorflow.python.framework.ops.Tensor in TensorFlow 2.0?


Problem description

I have a few tensors in my code and need to get the values of those tensors. This is one of them. How do I print the value of the tensor OA?

Input: OA
Output: <tf.Tensor 'Sum_1:0' shape=(1, 600) dtype=float32>

Input: type(OA)
Output: tensorflow.python.framework.ops.Tensor

I have tried all of the available functions like tf.print(), eval(), and tensor.numpy(), and none of them worked for me in TensorFlow 2.0. They seem to work only for 'EagerTensor' and not for 'ops.Tensor'.

1) OA.eval(session=sess) Error: ValueError: Cannot use the given session to evaluate tensor: the tensor's graph is different from the session's graph.

2) tf.print(OA) Output:

3) print (OA.numpy()) Output: AttributeError: 'Tensor' object has no attribute 'numpy'

Is there any way to convert an ops.Tensor to an EagerTensor so the functions above work? Or is there any other way to print the value of an ops.Tensor? Please advise.
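
For context, here is a minimal sketch (not from the original question) contrasting the two kinds of tensors in TF 2.0: a tensor created eagerly exposes .numpy(), while a symbolic tensor built from a tf.keras.Input inside the functional API does not.

import tensorflow as tf

# Eager tensor: created directly, so its value is available immediately
eager_t = tf.constant([[1.0, 2.0, 3.0]])
print(type(eager_t))      # <class 'tensorflow.python.framework.ops.EagerTensor'>
print(eager_t.numpy())    # [[1. 2. 3.]]

# Symbolic tensor: built from a tf.keras.Input placeholder; it has no value yet,
# so .numpy() raises AttributeError until a model is built and actually run
sym_in = tf.keras.Input(shape=(3,))
sym_t = tf.keras.layers.Dense(4)(sym_in)
print(type(sym_t))        # in TF 2.0: <class 'tensorflow.python.framework.ops.Tensor'>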

-- Below is minimal code to reproduce an example ops.Tensor in TF 2.0.

!pip install tensorflow==2.0.0

import tensorflow as tf
from tensorflow.keras import regularizers

tf.__version__

# Hyperparameters
EMBEDDING_DIM = 300
max_length = 120
batch_size = 512
vocab_size = 1000
units = 300

# Symbolic input placeholder; everything built from it is a graph (ops.Tensor) tensor
input_text = tf.keras.Input(shape=(max_length,), batch_size=batch_size)

embedding_layer = tf.keras.layers.Embedding(vocab_size, EMBEDDING_DIM, input_length=max_length, name="Embedding_Layer_1")
embedding_sequence = embedding_layer(input_text)

HQ = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(units, recurrent_dropout=0.5, kernel_regularizer=regularizers.l2(0.001), return_sequences=True, name='Bidirectional_1'))(embedding_sequence)
HQ = tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(units, recurrent_dropout=0.5, kernel_regularizer=regularizers.l2(0.001), name='Bidirectional_2'))(HQ)

print(HQ)

Output: Tensor("bidirectional_3/concat:0", shape=(512, 600), dtype=float32)

type(HQ)

Output: tensorflow.python.framework.ops.Tensor

How can I check the actual values of this tensor?

Recommended answer

Your graph is not complete at the point where you are printing HQ. You need to complete the model creation, presumably with something like:

output = tf.keras.layers.xyz()(HQ)
model = tf.keras.models.Model(input_text, output)

The trick to printing an intermediate layer is to make it an output. You can temporarily make it an additional output of your existing model, or just build a new model:

inspection_model = tf.keras.models.Model(input_text, [output, HQ])

Now run inference on your inspection_model to get the value of the intermediate activation HQ:

print(inspection_model(xyz))
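
Putting it together with the question's model, a minimal end-to-end sketch might look like the following; the Dense output head and the random dummy batch are illustrative assumptions, not part of the original code:

# Hypothetical output head, only to complete the graph for this example
output = tf.keras.layers.Dense(1, activation='sigmoid')(HQ)

# Expose both the final output and the intermediate activation HQ
inspection_model = tf.keras.models.Model(input_text, [output, HQ])

# Dummy integer batch of token ids with the expected shape, just to drive one forward pass
dummy_batch = tf.random.uniform((batch_size, max_length), maxval=vocab_size, dtype=tf.int32)

final_out, hq_values = inspection_model(dummy_batch)
print(hq_values)          # now an EagerTensor of shape (512, 600)
print(hq_values.numpy())  # .numpy() works on it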
