Unable to use trained TensorFlow model
Question
I am new to deep learning and TensorFlow. I retrained a pretrained TensorFlow Inception v3 model, exported as saved_model.pb, to recognize different types of images, but the following code fails when I try to use the file:
import tensorflow as tf

with tf.Session() as sess:
    with tf.gfile.FastGFile("tensorflow/trained/saved_model.pb", 'rb') as f:
        graph_def = tf.GraphDef()
        graph_def.ParseFromString(f.read())
        g_in = tf.import_graph_def(graph_def)
    LOGDIR = '/log'
    train_writer = tf.summary.FileWriter(LOGDIR)
    train_writer.add_graph(sess.graph)
It gives me this error:
File "testing.py", line 7, in <module>
graph_def.ParseFromString(f.read())
google.protobuf.message.DecodeError: Error parsing message
I tried many of the solutions I could find for this problem, and the modules in tensorflow/python/tools that use the graph_def.ParseFromString(f.read()) function give me the same error. Please tell me how to solve this, or a way in which I can avoid the ParseFromString(f.read()) call. Any help would be appreciated. Thank you!
Answer
I am assuming that you saved your trained model using the tf.saved_model.Builder provided by TensorFlow, in which case you could possibly do something like the following:
import tensorflow as tf

# Note: loader.load expects the export *directory* (the folder
# containing saved_model.pb), not the path of the .pb file itself
export_path = './path/to'

# We start a session using a temporary fresh Graph
with tf.Session(graph=tf.Graph()) as sess:
    # You can provide 'tags' when saving a model;
    # in my case I provided the 'serve' tag
    tf.saved_model.loader.load(sess, ['serve'], export_path)

    graph = tf.get_default_graph()
    # print your graph's ops, if needed
    print(graph.get_operations())

    # In my case, I named my input and output tensors
    # 'input:0' and 'output:0' respectively
    y_pred = sess.run('output:0', feed_dict={'input:0': X_test})
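The DecodeError in the question arises because saved_model.pb is a SavedModel protocol buffer, which wraps one or more MetaGraphDefs; it is not a bare GraphDef, so GraphDef.ParseFromString cannot decode it. If you really do need the raw GraphDef, one option is to parse the SavedModel wrapper first and pull the GraphDef out of it. A minimal sketch, assuming the file contains a single MetaGraph (the helper name is mine):

```python
from tensorflow.core.protobuf import saved_model_pb2


def graph_def_from_saved_model(pb_path):
    """Parse saved_model.pb and return the GraphDef of its first MetaGraph."""
    saved_model = saved_model_pb2.SavedModel()
    with open(pb_path, 'rb') as f:
        # This succeeds where GraphDef.ParseFromString fails, because
        # the bytes on disk really are a serialized SavedModel message.
        saved_model.ParseFromString(f.read())
    return saved_model.meta_graphs[0].graph_def
```

Note that this only recovers the graph structure; the variable values live in the sibling `variables/` directory, which is why loading via tf.saved_model.loader.load, as above, is usually the better route.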
To give some more context, this is how I saved my model so that it can be loaded as above.
x = tf.get_default_graph().get_tensor_by_name('input:0')
y = tf.get_default_graph().get_tensor_by_name('output:0')

export_path = './models/'
builder = tf.saved_model.builder.SavedModelBuilder(export_path)
signature = tf.saved_model.signature_def_utils.predict_signature_def(
    inputs={'input': x}, outputs={'output': y}
)
# using a custom tag instead of tags=[tf.saved_model.tag_constants.SERVING]
builder.add_meta_graph_and_variables(sess=sess,
                                     tags=['serve'],
                                     signature_def_map={'predict': signature})
builder.save()
This will save your protobuf ('saved_model.pb') in the given folder ('./models/' here), which can then be loaded as shown above.
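Putting the two halves together, here is a minimal, self-contained round trip with a trivial stand-in graph in place of the trained classifier. This is a sketch, not the asker's actual model; it assumes TensorFlow 1.13+ so that the tensorflow.compat.v1 namespace is available (on plain TF 1.x you would use `import tensorflow as tf` directly):

```python
import os
import tempfile

import numpy as np
import tensorflow.compat.v1 as tf

tf.disable_v2_behavior()  # graph mode, as in the TF 1.x code above

export_path = os.path.join(tempfile.mkdtemp(), 'models')

# --- save: a trivial graph standing in for the trained classifier ---
with tf.Session(graph=tf.Graph()) as sess:
    x = tf.placeholder(tf.float32, shape=[None, 3], name='input')
    w = tf.Variable(tf.ones([3, 1]), name='w')
    y = tf.identity(tf.matmul(x, w), name='output')
    sess.run(tf.global_variables_initializer())

    builder = tf.saved_model.builder.SavedModelBuilder(export_path)
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'input': x}, outputs={'output': y})
    builder.add_meta_graph_and_variables(
        sess=sess, tags=['serve'],
        signature_def_map={'predict': signature})
    builder.save()

# --- load: mirrors the answer above ---
with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess, ['serve'], export_path)
    pred = sess.run('output:0',
                    feed_dict={'input:0': np.ones((2, 3), np.float32)})
    print(pred)  # each row is 1*1 + 1*1 + 1*1 = 3.0
```

The tensor names 'input:0' and 'output:0' work here only because the ops were explicitly named 'input' and 'output' at save time; with a retrained Inception v3 graph you would first inspect `graph.get_operations()` (or the saved signature_def) to find the real names.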