Problems with using tensorflow lite C++ API in Android Studio Project


Problem description

I am currently working on a project regarding neural networks. For this, I want to build an Android Application which should use tensorflow [lite] to solve some object detection / recognition problems.

As I want the code to be as portable as possible, I want to write most of the code in C++, thus using the C++ API of tensorflow lite over the Java API / wrapper. So, I modified tensorflow/contrib/lite/BUILD and added the following to be able to create a shared tensorflow library.

cc_binary(
    name = "libtensorflowLite.so",
    linkopts = ["-shared", "-Wl"],
    linkshared = 1,
    copts = tflite_copts(),
    deps = [
        ":framework",
        "//tensorflow/contrib/lite/kernels:builtin_ops",
    ],
)

(Which is based on the answer to this issue: https://github.com/tensorflow/tensorflow/issues/17826)

Then I used

bazel build //tensorflow/contrib/lite:libtensorflowLite.so --crosstool_top=//external:android/crosstool --cpu=arm64-v8a --host_crosstool_top=@bazel_tools//tools/cpp:toolchain --cxxopt="-std=c++11"

to finally build it.

Afterwards I headed over to Android Studio and set up a basic project. To add the shared library to the project, I referred to this example:

https://github.com/googlesamples/android-ndk/tree/840858984e1bb8a7fab37c1b7c571efbe7d6eb75/hello-libs

I also added the needed dependencies for flatbuffers.

The build / compilation process succeeds without any linker errors (well, at least after a few hours of trial and error...).

The APK is then successfully installed on an Android device, but immediately crashes after it starts. Logcat gives the following output:

04-14 20:09:59.084 9623-9623/com.example.hellolibs E/AndroidRuntime: FATAL EXCEPTION: main
    Process: com.example.hellolibs, PID: 9623
    java.lang.UnsatisfiedLinkError: dlopen failed: library "/home/User/tensorflowtest/app/src/main/cpp/../../../../distribution/tensorflow/lib/x86/libtensorflowLite.so" not found
        at java.lang.Runtime.loadLibrary0(Runtime.java:1016)
        at java.lang.System.loadLibrary(System.java:1657)
        at com.example.hellolibs.MainActivity.<clinit>(MainActivity.java:36)
        at java.lang.Class.newInstance(Native Method)
        at android.app.Instrumentation.newActivity(Instrumentation.java:1174)
        at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2669)
        at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2856)
        at android.app.ActivityThread.-wrap11(Unknown Source:0)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1589)
        at android.os.Handler.dispatchMessage(Handler.java:106)
        at android.os.Looper.loop(Looper.java:164)
        at android.app.ActivityThread.main(ActivityThread.java:6494)
        at java.lang.reflect.Method.invoke(Native Method)
        at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:438)
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:807)

I tried this on an Android x86 emulator and on a real arm64-v8a Android smartphone.

So to me it looks like the application tries to load the tensorflowLite shared library on startup but is unable to find it. Opening the APK with a zip archive manager, I can verify that the platform-dependent (arm, x86) .so files are packed into the APK as expected; they are packed because of the following addition to build.gradle:

sourceSets {
    main {
        // let gradle pack the shared library into apk
        jniLibs.srcDirs = ['../distribution/tensorflow/lib']
    }
}

What I do not understand is why it looks for the library in the path where I placed it on my Ubuntu 17.10 PC. So I thought I had made a mistake when adapting the example I mentioned earlier about adding external libraries to an Android Studio project. That's why I downloaded the whole example project, opened it in Android Studio and verified that it works as expected. Afterwards I replaced the example's libgperf.so with libtensorflowLite.so and left everything else, especially the CMakeLists.txt, untouched. But I get the exact same error again, so I suspect the problem lies with the libtensorflowLite library itself rather than with the Android project (although that is just my guess).

I am working with Android Studio 3.1.1, NDK version 14 and API level 24 (Android 7.0). If anyone has an idea what could be wrong, any help would be highly appreciated. I am also open to any other approach that lets me use tensorflow lite with C++ in an Android application.

Thanks a lot,

Martin

Solution

I just remembered I asked this question a few weeks ago. Meanwhile, I found a solution to the problem and TensorflowLite is now nicely embedded into my Android Project, where I do all the programming using the C++ API!

The problem was that the Tensorflow shared library I built did not contain a soname. So during the build process the library was stripped, and since no name was found, the full path was used as the "name". I noticed this while investigating my native-lib.so (the NDK C++ library which is then loaded by the app) with the Linux "strings" tool. There I found that the library was indeed referenced by the absolute path "/home/User/tensorflowtest/app/src/main/cpp/../../../../distribution/tensorflow/lib/x86/libtensorflowLite.so". Adding "-Wl,-soname=libtensorflowLite.so" to the link options in the BUILD file fixed the issue! You can find the whole rule I used below.
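
As a side note, you can also verify from native code that the library now resolves by its plain soname. Here is a quick sketch of such a check (the helper name and log tag are arbitrary choices of mine, and of course this only runs once your own native library has loaded in the first place):

    #include <dlfcn.h>
    #include <android/log.h>

    // Try to resolve the TFLite library by its plain soname, the same way the
    // dynamic linker does when native-lib.so lists it as a dependency. Without
    // a proper DT_SONAME, native-lib.so records the absolute build-machine path
    // instead and the lookup fails on the device.
    static void checkTensorflowLiteResolves() {
        void* handle = dlopen("libtensorflowLite.so", RTLD_NOW);
        if (handle == nullptr) {
            __android_log_print(ANDROID_LOG_ERROR, "TFLiteCheck",
                                "dlopen failed: %s", dlerror());
        } else {
            __android_log_print(ANDROID_LOG_INFO, "TFLiteCheck",
                                "libtensorflowLite.so resolved correctly");
            dlclose(handle);
        }
    }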

As it was a pain to get everything set up due to the lack of explanations (it seems TensorflowLite is mostly used via the Java API on Android?), I want to give a short guide on how to use the C++ API of TensorflowLite in Android Studio (from within an Android NDK project).

1. Build the library for your architecture

To use the C++ API, you first need to build the TensorflowLite library. For this, add the following rule to the BUILD file in tensorflow/contrib/lite:

cc_binary(
    name = "libtensorflowLite.so",
    linkopts = [
        "-shared",
        "-Wl,-soname=libtensorflowLite.so",
    ],
    linkshared = 1,
    copts = tflite_copts(),
    deps = [
        ":framework",
        "//tensorflow/contrib/lite/kernels:builtin_ops",
    ],
)

Note: This builds a shared library; a static one might work as well.

Now you can build the library using

bazel build //tensorflow/contrib/lite:libtensorflowLite.so --crosstool_top=//external:android/crosstool --cpu=arm64-v8a --host_crosstool_top=@bazel_tools//tools/cpp:toolchain --cxxopt="-std=c++11"

If you want to support multiple architectures, you will have to build the library several times and change the --cpu flag correspondingly.

NOTE: This works fine at least for arm64-v8a and armeabi-v7a (I haven't tested it with MIPS, so that might work as well). On an x86 device, however, I get the "atomic_store_8" error already addressed in this topic: https://github.com/tensorflow/tensorflow/issues/16589

2. Add the library and the needed headers to be included in your Android Studio project

Having built the library, you now need to make sure it is also linked into your application (more specifically: into your Android NDK library, which in my case is named "native-lib"). I will give a short overview of how to do this; if you need a more detailed explanation, refer to the github link I provided in my initial question: https://github.com/googlesamples/android-ndk/tree/840858984e1bb8a7fab37c1b7c571efbe7d6eb75/hello-libs

2.1. In your Android Studio Project, open the CMakeLists.txt

2.2. Add the following:

    # This will create a new "variable" holding the path to a directory
    # where we will put our library and header files.
    # Change this to your needs
    set(distribution_DIR ${CMAKE_SOURCE_DIR}/distribution)

    # This states that there exists a shared library called libtensorflowLite
    # which will be imported (means it is not built with the rest of the project!)
    add_library(libtensorflowLite SHARED IMPORTED)

    # This indicates where the libtensorflowLite.so for each architecture is found relative to our distribution directory
    set_target_properties(libtensorflowLite PROPERTIES IMPORTED_LOCATION
        ${distribution_DIR}/lib/${ANDROID_ABI}/libtensorflowLite.so)

    # This indicates where the header files are found relative to our distribution dir
    target_include_directories(native-lib PRIVATE
                       ${distribution_DIR}/include)

    # Finally, we make sure our libtensorflowLite.so is linked to our native-lib and loaded during runtime 
    target_link_libraries( # Specifies the target library.
                   native-lib
                   libtensorflowLite
                   # Links the target library to the log library
                   # included in the NDK.
                   ${log-lib} )

2.3. Open the build.gradle for your Module: App (not the project one!)

2.4. Make sure our library will be packed into your APK

Add this inside the Android section:

    sourceSets {
        main {
            // let gradle pack the shared library into apk
            jni.srcDirs = []
            jniLibs.srcDirs = ['distribution/lib']
        }
    }

You may have to edit the path according to your needs: the files here will be packed into your .apk inside the lib directory.

3. Include flatbuffers

TensorflowLite uses the flatbuffers serialization library. I guess this gets added automatically if you build your project using bazel, but this is not the case when using Android Studio. You could of course add flatbuffers as a static or shared library as well. However, for me it was easiest to simply compile flatbuffers together with the rest of my app every time (it is not that big). I copied all of the flatbuffers *.cpp source files into my project and added them to the CMakeLists.

4. Copy the needed headers for TensorflowLite and flatbuffers

In 3. I just copied the cpp files to my project. However, the header files need to be located in the directory we set in target_include_directories in step 2.2.

So go ahead and copy all of the flatbuffers *.h files (from the flatbuffers repository) to this directory. Next, from the TensorflowLite repository, you need all header files inside the tensorflow/contrib/lite directory. However, you should keep the folder structure intact.

For me it looks like this:

  • distribution
    • lib
      • arm64-v8a
        • libtensorflowLite
      • armeabi-v7a
        • libtensorflowLite
    • include
      • flatbuffers
      • tensorflow
        • contrib
          • lite
            • kernels
            • nnapi
            • schema
            • tools
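
To check that everything works end to end, here is roughly what a minimal native-lib could look like when it loads a model through the C++ API. This is only a stripped-down sketch: the JNI function name follows the hello-libs sample (com.example.hellolibs.MainActivity.stringFromJNI), and the model path "/sdcard/model.tflite" is just a placeholder you will want to replace.

    #include <jni.h>
    #include <memory>
    #include <string>

    // Headers copied in step 4, relative to the include directory from step 2.2
    #include "tensorflow/contrib/lite/interpreter.h"
    #include "tensorflow/contrib/lite/kernels/register.h"
    #include "tensorflow/contrib/lite/model.h"

    extern "C" JNIEXPORT jstring JNICALL
    Java_com_example_hellolibs_MainActivity_stringFromJNI(JNIEnv* env, jobject /* this */) {
        // Load a model from disk (placeholder path; in a real app you would
        // probably ship the model in the assets folder instead).
        auto model = tflite::FlatBufferModel::BuildFromFile("/sdcard/model.tflite");
        if (model == nullptr) {
            return env->NewStringUTF("Failed to load model");
        }

        // Build an interpreter using the built-in ops
        // (the ":builtin_ops" dependency of the cc_binary rule above).
        tflite::ops::builtin::BuiltinOpResolver resolver;
        std::unique_ptr<tflite::Interpreter> interpreter;
        tflite::InterpreterBuilder(*model, resolver)(&interpreter);
        if (interpreter == nullptr || interpreter->AllocateTensors() != kTfLiteOk) {
            return env->NewStringUTF("Failed to build interpreter");
        }

        // Fill the input tensor(s) here, e.g. via
        // interpreter->typed_input_tensor<float>(0), then run inference.
        if (interpreter->Invoke() != kTfLiteOk) {
            return env->NewStringUTF("Invoke failed");
        }

        std::string msg = "TensorflowLite ran, model has " +
                          std::to_string(interpreter->inputs().size()) + " input tensor(s)";
        return env->NewStringUTF(msg.c_str());
    }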

So, if I haven't forgotten anything, everything should be set up correctly by now! Hopefully this helped and it works for you as it did for me ;)

Best regards,

Martin
