How can I use GPU for running a tflite model (*.tflite) using tf.lite.Interpreter (in python)?


Problem description

I converted my model to a tflite file following the guide at https://www.tensorflow.org/lite/convert.


I tested the tflite model on my GPU server, which has 4 Nvidia TITAN GPUs. I used tf.lite.Interpreter to load and run the tflite model file.
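For reference, loading and running a *.tflite file with tf.lite.Interpreter looks roughly like this. This is a minimal sketch: the tiny Dense model is a stand-in for the real network, and the model is converted in memory rather than loaded from a file.

```python
# Minimal sketch: build a toy Keras model, convert it to a TFLite
# flatbuffer, and run one inference with tf.lite.Interpreter (CPU only).
import numpy as np
import tensorflow as tf

# Toy model standing in for the original network (assumption).
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(4,))])

# Convert to TFLite, as described at https://www.tensorflow.org/lite/convert
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# Load the flatbuffer and allocate input/output tensors.
# For a file on disk, use tf.lite.Interpreter(model_path="model.tflite").
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Run a single inference on random input.
x = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], x)
interpreter.invoke()
y = interpreter.get_tensor(output_details[0]["index"])
print(y.shape)  # (1, 2)
```

Even with this setup, all kernels execute on the CPU by default, which matches the 0% GPU utilization described below.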


It works like the former TensorFlow graph; however, the problem is that inference became too slow. When I investigated the cause, I found that GPU utilization was simply 0% while tf.lite.Interpreter was running.


Is there any way to run tf.lite.Interpreter with GPU support?

Recommended answer

https://github.com/tensorflow/tensorflow/issues/34536


The CPU is usually good enough for tflite, especially a multicore one.
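To actually use multiple cores, the interpreter can be given a thread count via the `num_threads` argument (available on `tf.lite.Interpreter` in TF 2.x). A minimal sketch, again with a toy in-memory model standing in for the real one:

```python
# Sketch: use several CPU worker threads for TFLite inference
# via the num_threads argument of tf.lite.Interpreter.
import numpy as np
import tensorflow as tf

# Toy model for illustration (assumption).
model = tf.keras.Sequential([tf.keras.layers.Dense(8, input_shape=(16,))])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# num_threads=4 lets the CPU kernels run on four worker threads.
interpreter = tf.lite.Interpreter(model_content=tflite_model, num_threads=4)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros((1, 16), dtype=np.float32))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)  # (1, 8)
```

For a tiny model like this the thread count makes no measurable difference; the benefit shows up on larger networks with heavier ops.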


Nvidia GPUs are likely not supported by tflite, whose GPU delegate targets mobile GPU platforms.

