Loading a graph from a .meta file from TensorFlow in C++ for inference

Problem Description

I have trained some models using TensorFlow 1.5.1, and I have the checkpoints for those models (including the .ckpt and .meta files). Now I want to do inference in C++ using those files.

In Python, I would do the following to save and load the graph and the checkpoints. For saving:

    images = tf.placeholder(...)  # the input layer
    # the graph def
    output = tf.nn.softmax(net)  # the output layer
    tf.add_to_collection('images', images)
    tf.add_to_collection('output', output)

For inference, I restore the graph and the checkpoint, then retrieve the input and output layers from the collections like so:

    meta_file = './models/last-100.meta'
    ckpt_file = './models/last-100'
    with tf.Session() as sess:
        saver = tf.train.import_meta_graph(meta_file)
        saver.restore(sess, ckpt_file)
        images = tf.get_collection('images')[0]  # get_collection returns a list
        output = tf.get_collection('output')[0]
        outputTensors = sess.run(output, feed_dict={images: np.array(an_image)})

Now, assuming that I did the saving in Python as usual, how can I do the restore and inference in C++ with simple code like in Python?

I have found examples and tutorials, but they are for TensorFlow versions 0.7 to 0.12, and the same code doesn't work for version 1.5. I found no tutorials for restoring models using the C++ API on the TensorFlow website.

Answer

For the sake of this thread, I will rephrase my comment into an answer.

Posting a full example would require either a CMake setup or putting the file into a specific directory to run bazel. As I favor the first way, and covering all parts would burst the limits of this post, I would rather redirect to a complete implementation in C99, C++, and Go without Bazel, which I have tested for TF > v1.5.

Loading a graph in C++ is not much more difficult than in Python, given that you have already compiled TensorFlow from source.

Start by creating an MWE that builds a very dumb network graph; this is always a good idea to figure out how things work:

import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[1, 2], name='input')
output = tf.identity(tf.layers.dense(x, 1), name='output')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    saver = tf.train.Saver(tf.global_variables())
    saver.save(sess, './exported/my_model')

There are probably tons of answers here on SO about this part, so I just leave it here without further explanation.

Before doing stuff in another language, we can try to do it properly in Python first -- in the sense that we then just need to rewrite it in C++. Even restoring is very easy in Python:

import tensorflow as tf

with tf.Session() as sess:

    # load the computation graph
    loader = tf.train.import_meta_graph('./exported/my_model.meta')
    sess.run(tf.global_variables_initializer())
    loader.restore(sess, './exported/my_model')

    x = tf.get_default_graph().get_tensor_by_name('input:0')
    output = tf.get_default_graph().get_tensor_by_name('output:0')

But this is not helpful, as most of these API endpoints do not exist in the C++ API (yet?). An alternative version would be:

import tensorflow as tf

with tf.Session() as sess:

    metaGraph = tf.train.import_meta_graph('./exported/my_model.meta')
    restore_op_name = metaGraph.as_saver_def().restore_op_name
    restore_op = tf.get_default_graph().get_operation_by_name(restore_op_name)
    filename_tensor_name = metaGraph.as_saver_def().filename_tensor_name
    sess.run(restore_op, {filename_tensor_name: './exported/my_model'})


    x = tf.get_default_graph().get_tensor_by_name('input:0')
    output = tf.get_default_graph().get_tensor_by_name('output:0')

Hang on. You can always use print(dir(object)) to get properties like restore_op_name, ... . Restoring a model is an operation in TensorFlow like every other operation; we just call this operation and provide the path (a string tensor) as an input. We can even write our own restore function:

def restore(sess, metaGraph, fn):
    restore_op_name = metaGraph.as_saver_def().restore_op_name   # u'save/restore_all'
    restore_op = tf.get_default_graph().get_operation_by_name(restore_op_name)
    filename_tensor_name = metaGraph.as_saver_def().filename_tensor_name  # u'save/Const'
    sess.run(restore_op, {filename_tensor_name: fn})

Even if this looks strange, it now greatly helps in doing the same stuff in C++.

Start with the usual stuff:

#include <tensorflow/core/public/session.h>
#include <tensorflow/core/public/session_options.h>
#include <tensorflow/core/protobuf/meta_graph.pb.h>
#include <string>
#include <iostream>

typedef std::vector<std::pair<std::string, tensorflow::Tensor>> tensor_dict;

int main(int argc, char const *argv[]) {

  const std::string graph_fn = "./exported/my_model.meta";
  const std::string checkpoint_fn = "./exported/my_model";

  // prepare session
  tensorflow::Session *sess;
  tensorflow::SessionOptions options;
  TF_CHECK_OK(tensorflow::NewSession(options, &sess));

  // here we will put our loading of the graph and weights

  return 0;
}

You should be able to compile this by either putting it into the TensorFlow repo and using bazel, or by simply following the instructions here to use CMake.

We need to read the meta_graph that tf.train.import_meta_graph would create. This can be done by:

tensorflow::MetaGraphDef graph_def;
TF_CHECK_OK(ReadBinaryProto(tensorflow::Env::Default(), graph_fn, &graph_def));

In C++, reading a graph from a file is not the same as importing a graph in Python. We also need to create the graph in a session by:

TF_CHECK_OK(sess->Create(graph_def.graph_def()));
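
A quick, optional sanity check: if you are not sure which tensor names ended up in the exported graph, a small sketch like the following can dump all node names via the protobuf accessors (in the MWE above they are simply input and output):

// Optional sanity check: print every node name in the loaded graph so you
// can verify the input/output names ("input" and "output" in the MWE above).
for (int i = 0; i < graph_def.graph_def().node_size(); ++i) {
  std::cout << graph_def.graph_def().node(i).name() << std::endl;
}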

Looking at the strange Python restore function above:

restore_op_name = metaGraph.as_saver_def().restore_op_name
restore_op = tf.get_default_graph().get_operation_by_name(restore_op_name)
filename_tensor_name = metaGraph.as_saver_def().filename_tensor_name

we can write the equivalent piece in C++:

const std::string restore_op_name = graph_def.saver_def().restore_op_name();
const std::string filename_tensor_name = graph_def.saver_def().filename_tensor_name();
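
The restore operation expects the checkpoint path to be fed as a scalar string tensor under filename_tensor_name. A minimal sketch of that feed (reusing the tensor_dict alias and checkpoint_fn from the skeleton above) could look like:

// The restore op reads the checkpoint path from a scalar DT_STRING tensor.
tensorflow::Tensor checkpoint_path_tensor(tensorflow::DT_STRING, tensorflow::TensorShape());
checkpoint_path_tensor.scalar<std::string>()() = checkpoint_fn;

tensor_dict feed_dict = {{filename_tensor_name, checkpoint_path_tensor}};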

Having this in place, we just run the operation by

TF_CHECK_OK(sess->Run(feed_dict,          // inputs
                      {},                 // output_tensor_names (we do not need them)
                      {restore_op_name},  // target_node_names
                      nullptr));          // outputs (there are no outputs this time)

Creating the feed_dict for actual inference is probably a post of its own, and this answer is already long enough; it only covers the most important stuff. I would like to redirect to a complete implementation in C99, C++, and Go without Bazel, which I have tested for TF > v1.5. It is not that hard -- it just can get very long in the case of the plain C version.
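
Still, for completeness, here is a rough sketch of what inference could then look like for the MWE from the beginning; the names input:0 and output:0 and the [1, 2] shape come from that toy graph, so adapt them to your own model:

// Sketch for the MWE above: feed a [1, 2] float tensor into "input:0"
// and fetch "output:0" from the restored graph.
tensorflow::Tensor input(tensorflow::DT_FLOAT, tensorflow::TensorShape({1, 2}));
input.matrix<float>()(0, 0) = 1.0f;
input.matrix<float>()(0, 1) = 2.0f;

std::vector<tensorflow::Tensor> outputs;
TF_CHECK_OK(sess->Run({{"input:0", input}},  // inputs
                      {"output:0"},          // output_tensor_names
                      {},                    // target_node_names
                      &outputs));            // fetched outputs

std::cout << "output: " << outputs[0].matrix<float>()(0, 0) << std::endl;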
