Run Tensorflow with NVIDIA TensorRT Inference Engine
Question
I would like to use NVIDIA TensorRT to run my Tensorflow models. Currently, TensorRT supports Caffe prototxt network descriptor files.
I was not able to find source code to convert Tensorflow models to Caffe models. Are there any workarounds?
Answer
TensorRT 3.0 supports import/conversion of TensorFlow graphs via its UFF (Universal Framework Format). Some layer implementations are missing and will require custom implementations via the IPlugin interface.
Previous versions didn't support native import of TensorFlow models/checkpoints.
What you can also do is export the layers/network description into your own intermediate format (such as a text file) and then use the TensorRT C++ API to construct the graph for inference. You'd have to export the convolution weights/biases separately. Make sure to pay attention to the weight layouts - TensorFlow uses NHWC while TensorRT uses NCHW. And for the weights, TF uses RSCK ([filter_height, filter_width, input_depth, output_depth]) while TensorRT uses KCRS ([output_depth, input_depth, filter_height, filter_width]).
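As a minimal illustration of the layout difference, each reordering is a single transpose. This is a numpy sketch (the array names and shapes here are hypothetical, chosen only to demonstrate the axis permutations):

```python
import numpy as np

# Hypothetical TF conv filter in RSCK layout:
# [filter_height, filter_width, input_depth, output_depth]
R, S, C, K = 3, 3, 16, 32
w_tf = np.arange(R * S * C * K, dtype=np.float32).reshape(R, S, C, K)

# Reorder to TensorRT's KCRS layout:
# [output_depth, input_depth, filter_height, filter_width]
w_trt = np.transpose(w_tf, (3, 2, 0, 1))
assert w_trt.shape == (K, C, R, S)

# The same idea applies to activations: NHWC -> NCHW
x_nhwc = np.zeros((1, 224, 224, 3), dtype=np.float32)
x_nchw = np.transpose(x_nhwc, (0, 3, 1, 2))
assert x_nchw.shape == (1, 3, 224, 224)

# Spot-check one element: filter k=5, input channel c=2, tap (r, s)=(1, 0)
assert w_trt[5, 2, 1, 0] == w_tf[1, 0, 2, 5]
```

The transpose tuple lists, for each output axis, which input axis it comes from; getting it backwards still produces the right shape for square filters, so verify with an element spot-check rather than shapes alone.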
See this paper for an extended discussion of tensor formats: https://arxiv.org/abs/1410.0759
Also, this link has useful related info: https://www.tensorflow.org/versions/master/extend/tool_developers/