Run a Tensorflow model without having Tensorflow installed
Problem description
I have a TF model that runs fine, built with Python and TFLearn. Is there a way to run this model on another system without installing Tensorflow on it? It is already pre-trained, so I just need to run data through it.
I am aware of tfcompile (thread here), but it seems quite complex to set up. Are there any alternatives?
Is there a way to run this model on another system without installing Tensorflow on it? It is already pre-trained, so I just need to run data through it.
Yes.
After you have trained the model, use tf.python.tools.freeze_graph
and tf.python.tools.optimize_for_inference_lib
to freeze and optimize it for inference on other devices such as Android.
The output of the above will be:
- A frozen graph protobuf file (.pb)
- An optimized graph protobuf file (.pb)
[These functions convert all of the model's variables into constant operations and export the result to a protobuf file.]
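The freeze-and-optimize step can be sketched in Python roughly as follows. This is a minimal sketch assuming the TF 1.x-style APIs named above; the function name, `input_names`/`output_names`, and the output path are placeholders you must adapt to your own model:

```python
def freeze_and_optimize(sess, input_names, output_names, out_path):
    """Convert a session's variables to constants and write an
    inference-optimized GraphDef to a .pb file.

    Sketch only: assumes the TF 1.x tooling referenced in the answer.
    """
    # Imports are kept local so this sketch can be read (and the
    # surrounding module imported) without TensorFlow installed.
    import tensorflow as tf
    from tensorflow.python.framework import graph_util
    from tensorflow.python.tools import optimize_for_inference_lib

    # 1. Replace every Variable in the graph with a Constant op,
    #    keeping only nodes reachable from the listed outputs.
    frozen = graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_names)

    # 2. Strip training-only nodes so the graph is leaner for inference.
    optimized = optimize_for_inference_lib.optimize_for_inference(
        frozen, input_names, output_names, tf.float32.as_datatype_enum)

    # 3. Serialize the optimized GraphDef to a protobuf file.
    with open(out_path, "wb") as f:
        f.write(optimized.SerializeToString())
```

The resulting .pb file is self-contained: weights are baked in as constants, so no checkpoint files need to ship with it.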
Load the optimized graph protobuf file using the inference methods available in Java and the other Tensorflow APIs, pass your data in, and read the output back.
[Note that for this you do not install the complete Tensorflow package; you only need the inference library.]
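The loading-and-running pattern looks roughly like this, shown in Python for brevity; the Java inference API follows the same shape (read the GraphDef bytes, import the graph, feed inputs, fetch outputs). The tensor names `input:0`/`output:0` in the usage comment are hypothetical and depend on your model:

```python
def run_frozen_graph(pb_path, feed, fetch_name):
    """Load a frozen .pb file and run one inference pass.

    Sketch only: the node names passed in `feed`/`fetch_name`
    must match the names in your exported graph.
    """
    # Local import so the sketch loads without TensorFlow installed.
    import tensorflow.compat.v1 as tf  # TF 1.x-style graph API

    # Parse the serialized GraphDef from disk.
    graph_def = tf.GraphDef()
    with open(pb_path, "rb") as f:
        graph_def.ParseFromString(f.read())

    with tf.Graph().as_default() as graph:
        # name="" keeps the original node names from the export.
        tf.import_graph_def(graph_def, name="")
        with tf.Session(graph=graph) as sess:
            return sess.run(fetch_name, feed_dict=feed)

# Usage (hypothetical tensor names):
# result = run_frozen_graph("optimized.pb",
#                           {"input:0": my_batch}, "output:0")
```

Because the graph is all constants, no variable initialization or checkpoint restore is needed before calling `sess.run`.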
A simple example is demonstrated here:
https://omid.al/posts/2017-02-20-Tutorial-Build-Your-First-Tensorflow-Android-App.html
It is for Android, but the procedure should be the same for Java.
For C++: click here