How to deploy openvino-opencv in Qt


Problem description

I want to use openvino-opencv for my Qt (Qt 5.7.1) based project. I have downloaded and installed openvino411 (corresponding to opencv411) following the instructions for Windows 10 here: https://docs.openvinotoolkit.org/latest/_docs_install_guides_installing_openvino_windows.html#Configure_MO. I wrote a .pri file to deploy opencv in Qt:

INCLUDEPATH += C:/openvino-411/openvino_2019.2.275/opencv/include

CONFIG(release, debug|release):{
    LIBS += -LC:/openvino-411/openvino_2019.2.275/opencv/lib \
            -lopencv_core411 -lopencv_highgui411 -lopencv_imgproc411 -lopencv_imgcodecs411 -lopencv_features2d411 -lopencv_ml411 -lopencv_objdetect411 -lopencv_dnn411
}
CONFIG(debug, debug|release):{
    LIBS += -LC:/openvino-411/openvino_2019.2.275/opencv/lib \
            -lopencv_core411d -lopencv_highgui411d -lopencv_imgproc411d -lopencv_imgcodecs411d -lopencv_features2d411d -lopencv_ml411d -lopencv_objdetect411d -lopencv_dnn411d
}

But it seems opencv cannot be run in Qt: when I try running the Qt program, the cmd window that pops up goes directly to "Press <RETURN> to close this window..." without actually doing anything.

Recommended answer

First of all, keep in mind that OpenVINO for Windows is built with MSVC (MSBUILD) rather than MinGW, so if your Qt project is compiled with MinGW, linking against the pre-built OpenVINO libraries will likely fail.
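
A mismatch like this is easier to diagnose if qmake fails early instead of letting the link step break. A minimal sketch of such a guard for a .pro/.pri file, assuming a standard Qt Windows mkspec whose name starts with win32-msvc (the messages are illustrative):

```qmake
# The pre-built OpenVINO/OpenCV Windows binaries are built with MSVC,
# so refuse to configure under any other toolchain.
win32 {
    win32-msvc* {
        message("MSVC detected: compatible with the pre-built OpenVINO libraries")
    } else {
        error("Pre-built OpenVINO Windows libraries require MSVC; MinGW builds will fail at link time")
    }
}
```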

That said, I managed to integrate the OpenVINO Inference Engine with OpenCV successfully into a large, already existing Qt-based project (Qt 5.13.1) under Linux (Ubuntu 16.04); it appears that under Windows the dependency fragmentation makes it harder.

This configuration is quite tricky and still a work in progress (for me); I am trying to completely isolate the OpenVINO dependencies, aiming to ship them fully embedded in our app. Anyway, it works like this:

First I installed OpenVINO (https://docs.openvinotoolkit.org/latest/_docs_install_guides_installing_openvino_linux.html), paying particular attention to following each step exactly as it is described.

Also, DON'T MISS RUNNING the two demos demo_security_barrier_camera and demo_squeezenet_download_convert_run: they produce two libraries, libcpu_extension.so and libgflags_nothreads.a, WITHOUT WHICH OpenVINO WILL NOT WORK IN YOUR PROJECT. Why it was made this way is unknown to me.
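
On my machine those two demos are shell scripts shipped with the OpenVINO install; the paths below are from a 2019 release and are an assumption about your layout:

```shell
# Both scripts build the demo projects and, as a side effect, leave
# libcpu_extension.so and libgflags_nothreads.a under
# ~/inference_engine_demos_build/Release/lib
cd /opt/intel/openvino/deployment_tools/demo
./demo_squeezenet_download_convert_run.sh
./demo_security_barrier_camera.sh
```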

I copied the following libraries under a subfolder of my project (ThirdPartyLibraries/OpenVINOInferenceEngine):

  • libinference_engine.so (found in OpenVINO installation folder: /opt/intel/openvino/inference_engine/lib/intel64/libinference_engine.so)
  • libtbb.so (found in OpenVINO installation folder: /opt/intel/openvino/inference_engine/external/tbb/lib/intel64/libtbb.so)

For the two "cpu extension" libraries, I created a subfolder named "extension", so:

  • extension/libgflags_nothreads.a (found in OpenVINO Inference Engine Demo BUILD FOLDER, for me it is /home/myuser/inference_engine_demos_build/Release/lib/libgflags_nothreads.a)
  • extension/libcpu_extension.so (found in OpenVINO Inference Engine Demo BUILD FOLDER, for me it is /home/myuser/inference_engine_demos_build/Release/lib/libcpu_extension.so)
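
The copy steps above boil down to a few shell commands; the source paths are the ones from my machine (assumptions, so adjust them to your own install and demo build folders):

```shell
# Assumed locations; adjust to your OpenVINO install and demo build.
OPENVINO=/opt/intel/openvino
DEMO_BUILD=$HOME/inference_engine_demos_build/Release/lib
DEST=ThirdPartyLibraries/OpenVINOInferenceEngine

mkdir -p "$DEST/extension"
cp "$OPENVINO/inference_engine/lib/intel64/libinference_engine.so" "$DEST/"
cp "$OPENVINO/inference_engine/external/tbb/lib/intel64/libtbb.so" "$DEST/"
cp "$DEMO_BUILD/libgflags_nothreads.a" "$DEST/extension/"
cp "$DEMO_BUILD/libcpu_extension.so" "$DEST/extension/"
```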

Then I also copied the includes of Inference Engine and Lib Cpu Extension from their respective installation folders to my ThirdPartyLibraries:

  • All the content found under /opt/intel/openvino/inference_engine/include/ goes under /ThirdPartyLibraries/OpenVINOInferenceEngine/include
  • All the content found under /opt/intel/openvino/deployment_tools/inference_engine/src/extension/ goes under /ThirdPartyLibraries/OpenVINOInferenceEngine/extension/include

Finally here's my .pri file for Qt:

OPENVINODIR = /home/myuser/code_qt5_HG/Libraries/ThirdPartyLibraries/OpenVINOInferenceEngine

LIBS_OPENVINO  += -L$$OPENVINODIR \
                  -linference_engine \
                  -ltbb \
                  -L$$OPENVINODIR/extension \
                  -lcpu_extension

INCLUDES_OPENVINO  += $$OPENVINODIR/include \
                      $$OPENVINODIR/extension/include

LIBS += $$LIBS_OPENVINO

INCLUDEPATH += $$INCLUDES_OPENVINO
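
Before running qmake it can help to sanity-check that the layout this .pri expects is actually in place; a small sketch (the relative path is the example layout from above, so adjust it to your checkout):

```shell
# Verify the four libraries the .pri above links against are where it expects.
OPENVINODIR=ThirdPartyLibraries/OpenVINOInferenceEngine
for f in libinference_engine.so libtbb.so \
         extension/libcpu_extension.so extension/libgflags_nothreads.a; do
    if [ -e "$OPENVINODIR/$f" ]; then
        echo "found:   $f"
    else
        echo "MISSING: $f"
    fi
done
```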

That's it, doing so allows me to reference and use Inference Engine in my project like this:

 #include <ie_core.hpp>
 #include <ie_plugin_config.hpp>
 #include <cpp/ie_cnn_net_reader.h>
 #include <ext_list.hpp>

 .....

 // Create the Inference Engine core and register the CPU extensions library
 InferenceEngine::Core ie;
 ie.AddExtension(std::make_shared<InferenceEngine::Extensions::Cpu::CpuExtensions>(), "CPU");
 // Read the network topology (.xml) and weights (.bin) produced by the Model Optimizer
 InferenceEngine::CNNNetReader netReader;
 netReader.ReadNetwork(detectorXmlPath);
 netReader.getNetwork().setBatchSize(1);
 netReader.ReadWeights(detectorBinPath);
 InferenceEngine::InputsDataMap inputInfo(netReader.getNetwork().getInputsInfo());

 .....

To deploy my app to a third-party machine, I need to install OpenVINO on that machine following the regular procedure (https://docs.openvinotoolkit.org/latest/_docs_install_guides_installing_openvino_linux.html) and deploy my app as I usually do; the dependencies are then resolved correctly.

My last two cents: I am in direct contact with Intel, who are supporting me with the OpenVINO integration. According to them, "all the .so files in /deployment_tools/inference_engine/lib/intel64, in /deployment_tools/inference_engine/external/mkltiny_lnx/lib, and in /deployment_tools/inference_engine/external/tbb/lib are pretty much all the dependencies required". I haven't had the time to confirm that yet.
