Export Tensorflow graphs from Python for use in C++


Problem description

Exactly how should Python models be exported for use in C++?

I'm trying to do something similar to this tutorial: https://www.tensorflow.org/versions/r0.8/tutorials/image_recognition/index.html

I'm trying to import my own TF model into the C++ API instead of the Inception one. I adjusted the input size and the paths, but strange errors keep popping up. I spent all day reading Stack Overflow and other forums, but to no avail.

I've tried two methods for exporting the graph.

Method 1: export_meta_graph:

# ...loading inputs, setting up the model, etc....

sess = tf.InteractiveSession()
sess.run(tf.initialize_all_variables())

for i in range(num_steps):
    x_batch, y_batch = batch(50)
    if i % 10 == 0:
        train_accuracy = accuracy.eval(feed_dict={
            x: x_batch, y_: y_batch, keep_prob: 1.0})
        print("step %d, training accuracy %g" % (i, train_accuracy))
    train_step.run(feed_dict={x: x_batch, y_: y_batch, keep_prob: 0.5})

print("test accuracy %g" % accuracy.eval(feed_dict={
    x: features_test, y_: labels_test, keep_prob: 1.0}))

saver = tf.train.Saver(tf.all_variables())
checkpoint = '/home/sander/tensorflow/tensorflow/examples/cat_face/data/model.ckpt'
saver.save(sess, checkpoint)

tf.train.export_meta_graph(
    filename='/home/sander/tensorflow/tensorflow/examples/cat_face/data/cat_graph.pb',
    meta_info_def=None,
    graph_def=sess.graph_def,
    saver_def=saver.restore(sess, checkpoint),
    collection_list=None, as_text=False)
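A likely culprit in the call above: `saver.restore(sess, checkpoint)` returns `None`, whereas the `saver_def` argument of `export_meta_graph` expects a `SaverDef` proto (`saver.as_saver_def()`). Note also that `export_meta_graph` writes a `MetaGraphDef`, not a plain `GraphDef`, which may be why a C++ `GraphDef` loader fails to parse the file. A minimal corrected sketch (written against the modern `tf.compat.v1` API as an assumption; the single toy variable stands in for the real network, and the `/tmp` path is illustrative):

```python
import os
import tensorflow.compat.v1 as tf  # assumption: a modern TF install

tf.disable_eager_execution()
os.makedirs("/tmp/cat_face", exist_ok=True)

# Toy stand-in for the real network: one variable to checkpoint.
w = tf.Variable(tf.ones([2]), name="weight1")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver = tf.train.Saver(tf.global_variables())
    saver.save(sess, "/tmp/cat_face/model.ckpt")
    # Pass a SaverDef proto, not the (None) return value of restore().
    tf.train.export_meta_graph(
        filename="/tmp/cat_face/cat_graph.meta",
        graph_def=sess.graph.as_graph_def(),
        saver_def=saver.as_saver_def(),
        as_text=False)
```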

Method 1 yields the following error when trying to run the program:

[libprotobuf ERROR 
google/protobuf/src/google/protobuf/wire_format_lite.cc:532] String field 
'tensorflow.NodeDef.op' contains invalid UTF-8 data when parsing a protocol 
buffer. Use the 'bytes' type if you intend to send raw bytes. 
E tensorflow/examples/cat_face/main.cc:281] Not found: Failed to load 
compute graph at 'tensorflow/examples/cat_face/data/cat_graph.pb'

I also tried another method of exporting the graph:

Method 2: write_graph:

tf.train.write_graph(sess.graph_def,
    '/home/sander/tensorflow/tensorflow/examples/cat_face/data/',
    'cat_graph.pb', as_text=False)

This version actually seems to load something, but I get an error about variables not being initialized:

Running model failed: Failed precondition: Attempting to use uninitialized  
value weight1
[[Node: weight1/read = Identity[T=DT_FLOAT, _class=["loc:@weight1"], 
_device="/job:localhost/replica:0/task:0/cpu:0"](weight1)]]


Answer

First, write the graph definition to a file using the following command:

with tf.Session() as sess:
    # Build network here
    tf.train.write_graph(sess.graph.as_graph_def(), "C:\\output\\", "mymodel.pb")

Then, save your model using a Saver:

saver = tf.train.Saver(tf.global_variables()) 
saver.save(sess, "C:\\output\\mymodel.ckpt")
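Putting these two steps together, here is a minimal self-contained sketch (under stated assumptions: the modern `tf.compat.v1` API, a toy one-layer graph, and a `/tmp/output` directory standing in for your own network and path):

```python
import os
import tensorflow.compat.v1 as tf  # assumption: a modern TF install

tf.disable_eager_execution()
os.makedirs("/tmp/output", exist_ok=True)

# Toy network: input placeholder -> one weight matrix -> softmax.
x = tf.placeholder(tf.float32, shape=[None, 4], name="input")
w = tf.Variable(tf.ones([4, 2]), name="weight1")
y = tf.nn.softmax(tf.matmul(x, w), name="softmax")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Step 1: the graph structure only (no variable values).
    tf.train.write_graph(sess.graph.as_graph_def(), "/tmp/output",
                         "mymodel.pb", as_text=False)
    # Step 2: the variable values, via a checkpoint.
    saver = tf.train.Saver(tf.global_variables())
    saver.save(sess, "/tmp/output/mymodel.ckpt")
```

Note that step 1 alone explains the "uninitialized value weight1" error above: write_graph saves no variable values, so they must be supplied separately (a checkpoint) or baked in (freezing, below).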

You will then have two files in your output directory: mymodel.ckpt and mymodel.pb.

Download freeze_graph.py (found under tensorflow/python/tools in the TensorFlow repository) and run the following command in C:\output\. Change the output node name if yours is different.


python freeze_graph.py --input_graph mymodel.pb --input_checkpoint mymodel.ckpt --output_node_names softmax/Reshape_1 --output_graph mymodelforc.pb
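If downloading the script is inconvenient, the same freeze can be done in-process with `convert_variables_to_constants` (a sketch under the same assumptions as before: the `tf.compat.v1` API, a toy one-layer graph, and illustrative `/tmp` paths; substitute your own graph and output node names):

```python
import tensorflow.compat.v1 as tf  # assumption: a modern TF install

tf.disable_eager_execution()

# Toy graph standing in for your network.
x = tf.placeholder(tf.float32, shape=[None, 4], name="input")
w = tf.Variable(tf.ones([4, 2]), name="weight1")
y = tf.nn.softmax(tf.matmul(x, w), name="softmax")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Bake the current variable values into constants, keeping only
    # the ops needed to compute the listed output nodes.
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph.as_graph_def(), ["softmax"])

with tf.gfile.GFile("/tmp/mymodelforc.pb", "wb") as f:
    f.write(frozen.SerializeToString())
```

The resulting file is a plain GraphDef with no Variable ops, which is exactly what the C++ loader below expects.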

You can use mymodelforc.pb directly from C++.

You can use the following C++ code to load the proto file:

#include "tensorflow/core/public/session.h"
#include "tensorflow/core/platform/env.h"
#include "tensorflow/cc/ops/image_ops.h"

using namespace tensorflow;

Session* session;
NewSession(SessionOptions(), &session);

GraphDef graph_def;
ReadBinaryProto(Env::Default(), "C:\\output\\mymodelforc.pb", &graph_def);

session->Create(graph_def);

Now you can use the session for inference.

You can feed the inference inputs as follows:

// Same dimensions and type as the input of your network
tensorflow::Tensor input_tensor(tensorflow::DT_FLOAT, tensorflow::TensorShape({ 1, height, width, channel }));
std::vector<tensorflow::Tensor> finalOutput;

// Fill input tensor with your input data

std::string InputName = "input";               // Your input placeholder's name
std::string OutputName = "softmax/Reshape_1";  // Your output node's name

session->Run({ { InputName, input_tensor } }, { OutputName }, {}, &finalOutput);

// finalOutput will contain the inference output you are looking for
