How do I use native C libraries in Android Studio


Problem description


I created a project some years back based on https://ikaruga2.wordpress.com/2011/06/15/video-live-wallpaper-part-1/. My project was built in the version of Eclipse provided directly by Google at the time, and it worked fine with a copy of the compiled ffmpeg libraries created with my app name.

Now I'm trying to create a new app based on my old one. As Google no longer supports Eclipse, I downloaded Android Studio and imported my project. With a few tweaks, I was able to successfully compile the old version of the project. So I modified the name, copied a new set of ".so" files into app\src\main\jniLibs\armeabi (where I assumed they should go), and tried running the application on my phone again with absolutely no other changes.
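
For reference, my understanding is that the Android Gradle plugin packages native libraries from src/main/jniLibs/<abi> by default. A minimal sketch of making that default explicit in app/build.gradle (this is the stock plugin DSL, not anything special from my project):

    android {
        sourceSets.main {
            // src/main/jniLibs is already the default location the plugin
            // packages .so files from; declaring it here only documents
            // where the libraries are expected to live
            jniLibs.srcDir 'src/main/jniLibs'
        }
    }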

The NDK throws no errors. Gradle compiles the app without errors and installs it on my phone. The app appears in my live wallpapers list, and I can click it to bring up the preview. But instead of a video appearing, I receive an error, and LogCat reports:

02-26 21:50:31.164  18757-18757/? E/AndroidRuntime﹕ FATAL EXCEPTION: main
java.lang.ExceptionInInitializerError
        at com.nightscapecreations.anim3free.VideoLiveWallpaper.onSharedPreferenceChanged(VideoLiveWallpaper.java:165)
        at com.nightscapecreations.anim3free.VideoLiveWallpaper.onCreate(VideoLiveWallpaper.java:81)
        at android.app.ActivityThread.handleCreateService(ActivityThread.java:2273)
        at android.app.ActivityThread.access$1600(ActivityThread.java:127)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1212)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at android.os.Looper.loop(Looper.java:137)
        at android.app.ActivityThread.main(ActivityThread.java:4441)
        at java.lang.reflect.Method.invokeNative(Native Method)
        at java.lang.reflect.Method.invoke(Method.java:511)
        at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:823)
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:590)
        at dalvik.system.NativeStart.main(Native Method)
 Caused by: java.lang.UnsatisfiedLinkError: Cannot load library: link_image[1936]:   144 could not load needed library '/data/data/com.nightscapecreations.anim1free/lib/libavutil.so' for 'libavcore.so' (load_library[1091]: Library '/data/data/com.nightscapecreations.anim1free/lib/libavutil.so' not found)
        at java.lang.Runtime.loadLibrary(Runtime.java:370)
        at java.lang.System.loadLibrary(System.java:535)
        at com.nightscapecreations.anim3free.NativeCalls.<clinit>(NativeCalls.java:64)
        at com.nightscapecreations.anim3free.VideoLiveWallpaper.onSharedPreferenceChanged(VideoLiveWallpaper.java:165)
        at com.nightscapecreations.anim3free.VideoLiveWallpaper.onCreate(VideoLiveWallpaper.java:81)
        at android.app.ActivityThread.handleCreateService(ActivityThread.java:2273)
        at android.app.ActivityThread.access$1600(ActivityThread.java:127)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1212)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at android.os.Looper.loop(Looper.java:137)
        at android.app.ActivityThread.main(ActivityThread.java:4441)
        at java.lang.reflect.Method.invokeNative(Native Method)
        at java.lang.reflect.Method.invoke(Method.java:511)

I'm a novice Android/Java/C++ developer and am not sure what this error means, but Google leads me to believe that my new libraries are not being found. Notably, the UnsatisfiedLinkError above is looking for libavutil.so under the old package name (com.nightscapecreations.anim1free) even though the new app is com.nightscapecreations.anim3free, which suggests the prebuilt libraries still reference the old install path. In my Eclipse project I had this set of libraries in "libs\armeabi", and another copy of them in a more complicated folder structure at "jni\ffmpeg-android\build\ffmpeg\armeabi\lib". Android Studio appears to have kept everything the same, other than renaming "libs" to "jniLibs", but I'm hitting a brick wall with this error and am unsure how to proceed.

How can I compile this new app with the new name using Android Studio?

In case it helps, here is my Android.mk file:

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    MY_LIB_PATH := ffmpeg-android/build/ffmpeg/armeabi/lib
    LOCAL_MODULE := bambuser-libavcore
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcore.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavformat
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavformat.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavcodec
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcodec.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavfilter
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavfilter.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavutil
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavutil.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libswscale
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libswscale.so
    include $(PREBUILT_SHARED_LIBRARY)

    #local_PATH := $(call my-dir)

    include $(CLEAR_VARS)

    LOCAL_CFLAGS := -DANDROID_NDK \
                    -DDISABLE_IMPORTGL

    LOCAL_MODULE    := video
    LOCAL_SRC_FILES := video.c

    LOCAL_C_INCLUDES := \
        $(LOCAL_PATH)/include \
        $(LOCAL_PATH)/ffmpeg-android/ffmpeg \
        $(LOCAL_PATH)/freetype/include/freetype2 \
        $(LOCAL_PATH)/freetype/include \
        $(LOCAL_PATH)/ftgl/src \
        $(LOCAL_PATH)/ftgl
    LOCAL_LDLIBS := -L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib -L$(LOCAL_PATH) -L$(LOCAL_PATH)/ffmpeg-android/build/ffmpeg/armeabi/lib/ -lGLESv1_CM -ldl -lavformat -lavcodec -lavfilter -lavutil -lswscale -llog -lz -lm

    include $(BUILD_SHARED_LIBRARY)

And here is my NativeCalls.java:

    package com.nightscapecreations.anim3free;

    public class NativeCalls {
        //ffmpeg
        public static native void initVideo();
        public static native void loadVideo(String fileName); //
        public static native void prepareStorageFrame();
        public static native void getFrame(); //
        public static native void freeConversionStorage();
        public static native void closeVideo();//
        public static native void freeVideo();//
        //opengl
        public static native void initPreOpenGL(); //
        public static native void initOpenGL(); //
        public static native void drawFrame(); //
        public static native void closeOpenGL(); //
        public static native void closePostOpenGL();//
        //wallpaper
        public static native void updateVideoPosition();
        public static native void setSpanVideo(boolean b);
        //getters
        public static native int getVideoHeight();
        public static native int getVideoWidth();
        //setters
        public static native void setWallVideoDimensions(int w,int h);
        public static native void setWallDimensions(int w,int h);
        public static native void setScreenPadding(int w,int h);
        public static native void setVideoMargins(int w,int h);
        public static native void setDrawDimensions(int drawWidth,int drawHeight);
        public static native void setOffsets(int x,int y);
        public static native void setSteps(int xs,int ys);
        public static native void setScreenDimensions(int w, int h);
        public static native void setTextureDimensions(int tx,
                               int ty );
        public static native void setOrientation(boolean b);
        public static native void setPreviewMode(boolean b);
        public static native void setTonality(int t);
        public static native void toggleGetFrame(boolean b);
        //fps
        public static native void setLoopVideo(boolean b);

        static {
        System.loadLibrary("avcore");
        System.loadLibrary("avformat");
        System.loadLibrary("avcodec");
        //System.loadLibrary("avdevice");
        System.loadLibrary("avfilter");
        System.loadLibrary("avutil");
        System.loadLibrary("swscale");
        System.loadLibrary("video");
        }

    }

EDIT

This is the first part of my video.c file:

    #include <GLES/gl.h>
    #include <GLES/glext.h>

    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    #include <stdlib.h>
    #include <time.h>

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libswscale/swscale.h>

    #include <jni.h>  
    #include <string.h>  
    #include <stdio.h>
    #include <android/log.h>

    //#include <FTGL/ftgl.h>

    //ffmpeg video variables
    int      initializedVideo=0;
    int      initializedFrame=0;
    AVFormatContext *pFormatCtx=NULL;
    int             videoStream;
    AVCodecContext  *pCodecCtx=NULL;
    AVCodec         *pCodec=NULL;
    AVFrame         *pFrame=NULL;
    AVPacket        packet;
    int             frameFinished;
    float           aspect_ratio;

    //ffmpeg video conversion variables
    AVFrame         *pFrameConverted=NULL;
    int             numBytes;
    uint8_t         *bufferConverted=NULL;

    //opengl
    int textureFormat=PIX_FMT_RGBA; // PIX_FMT_RGBA   PIX_FMT_RGB24
    int GL_colorFormat=GL_RGBA; // Must match the colorspace specified for textureFormat
    int textureWidth=256;
    int textureHeight=256;
    int nTextureHeight=-256;
    int textureL=0, textureR=0, textureW=0;
    int frameTonality;

    //GLuint textureConverted=0;
    GLuint texturesConverted[2] = { 0,1 };
    GLuint dummyTex = 2;
    static int len=0;


    static const char* BWVertexSrc =
             "attribute vec4 InVertex;\n"
             "attribute vec2 InTexCoord0;\n"
             "attribute vec2 InTexCoord1;\n"
             "uniform mat4 ProjectionModelviewMatrix;\n"
             "varying vec2 TexCoord0;\n"
             "varying vec2 TexCoord1;\n"

             "void main()\n"
             "{\n"
             "  gl_Position = ProjectionModelviewMatrix * InVertex;\n"
             "  TexCoord0 = InTexCoord0;\n"
             "  TexCoord1 = InTexCoord1;\n"
             "}\n";
    static const char* BWFragmentSrc  =

             "#version 110\n"
             "uniform sampler2D Texture0;\n"
             "uniform sampler2D Texture1;\n"

             "varying vec2 TexCoord0;\n"
             "varying vec2 TexCoord1;\n"

             "void main()\n"
             "{\n"
            "   vec3 color = texture2D(m_Texture, texCoord).rgb;\n"
            "   float gray = (color.r + color.g + color.b) / 3.0;\n"
            "   vec3 grayscale = vec3(gray);\n"

            "   gl_FragColor = vec4(grayscale, 1.0);\n"
             "}";
    static GLuint shaderProgram;


    //// Create a pixmap font from a TrueType file.
    //FTGLPixmapFont font("/home/user/Arial.ttf");
    //// Set the font size and render a small text.
    //font.FaceSize(72);
    //font.Render("Hello World!");

    //screen dimensions
    int screenWidth = 50;
    int screenHeight= 50;
    int screenL=0, screenR=0, screenW=0;
    int dPaddingX=0,dPaddingY=0;
    int drawWidth=50,drawHeight=50;

    //wallpaper
    int wallWidth = 50;
    int wallHeight = 50;
    int xOffSet, yOffSet;
    int xStep, yStep;
    jboolean spanVideo = JNI_TRUE;

    //video dimensions
    int wallVideoWidth = 0;
    int wallVideoHeight = 0;
    int marginX, marginY;
    jboolean isScreenPortrait = JNI_TRUE;
    jboolean isPreview = JNI_TRUE;
    jboolean loopVideo = JNI_TRUE;
    jboolean isGetFrame = JNI_TRUE;

    //file
    const char * szFileName;

    #define max( a, b ) ( ((a) > (b)) ? (a) : (b) )
    #define min( a, b ) ( ((a) < (b)) ? (a) : (b) )

    //test variables
    #define RGBA8(r, g, b)  (((r) << (24)) | ((g) << (16)) | ((b) << (8)) | 255)
    int sPixelsInited=JNI_FALSE;
    uint32_t *s_pixels=NULL;

    int s_pixels_size() { 
      return (sizeof(uint32_t) * textureWidth * textureHeight * 5); 
    }

    void render_pixels1(uint32_t *pixels, uint32_t c) {
        int x, y;
        /* fill in a square of 5 x 5 at s_x, s_y */
        for (y = 0; y < textureHeight; y++) {
            for (x = 0; x < textureWidth; x++) {
                int idx = x + y * textureWidth;
                pixels[idx++] = RGBA8(255, 255, 0);
            }
        }
    }

    void render_pixels2(uint32_t *pixels, uint32_t c) {
        int x, y;
        /* fill in a square of 5 x 5 at s_x, s_y */
        for (y = 0; y < textureHeight; y++) {
            for (x = 0; x < textureWidth; x++) {
                int idx = x + y * textureWidth;
                pixels[idx++] = RGBA8(0, 0, 255);
            }
        }
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_initVideo (JNIEnv * env, jobject this) {
        initializedVideo = 0;
        initializedFrame = 0;
    }

    /* list of things that get loaded: */
    /* buffer */
    /* pFrameConverted */
    /* pFrame */
    /* pCodecCtx */
    /* pFormatCtx */
    void Java_com_nightscapecreations_anim3free_NativeCalls_loadVideo (JNIEnv * env, jobject this, jstring fileName)  {
        jboolean isCopy;
        szFileName = (*env)->GetStringUTFChars(env, fileName, &isCopy);
        //debug
        __android_log_print(ANDROID_LOG_DEBUG, "NDK: ", "NDK:LC: [%s]", szFileName);
        // Register all formats and codecs
        av_register_all();
        // Open video file
        if(av_open_input_file(&pFormatCtx, szFileName, NULL, 0, NULL)!=0) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Couldn't open file");
        return;
        }
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Succesfully loaded file");
        // Retrieve stream information */
        if(av_find_stream_info(pFormatCtx)<0) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Couldn't find stream information");
        return;
        }
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Found stream info");
        // Find the first video stream
        videoStream=-1;
        int i;
        for(i=0; i<pFormatCtx->nb_streams; i++)
            if(pFormatCtx->streams[i]->codec->codec_type==CODEC_TYPE_VIDEO) {
                videoStream=i;
                break;
            }
        if(videoStream==-1) {
            __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Didn't find a video stream");
            return;
        }
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Found video stream");
        // Get a pointer to the codec contetx for the video stream
        pCodecCtx=pFormatCtx->streams[videoStream]->codec;
        // Find the decoder for the video stream
        pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
        if(pCodec==NULL) {
            __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Unsupported codec");
            return;
        }
        // Open codec
        if(avcodec_open(pCodecCtx, pCodec)<0) {
            __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Could not open codec");
            return;
        }
        // Allocate video frame (decoded pre-conversion frame)
        pFrame=avcodec_alloc_frame();
        // keep track of initialization
        initializedVideo = 1;
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Finished loading video");
    }

    //for this to work, you need to set the scaled video dimensions first
    void Java_com_nightscapecreations_anim3free_NativeCalls_prepareStorageFrame (JNIEnv * env, jobject this)  {
        // Allocate an AVFrame structure
        pFrameConverted=avcodec_alloc_frame();
        // Determine required buffer size and allocate buffer
        numBytes=avpicture_get_size(textureFormat, textureWidth, textureHeight);
        bufferConverted=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
        if ( pFrameConverted == NULL || bufferConverted == NULL )
            __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "Out of memory");
        // Assign appropriate parts of buffer to image planes in pFrameRGB
        // Note that pFrameRGB is an AVFrame, but AVFrame is a superset
        // of AVPicture
        avpicture_fill((AVPicture *)pFrameConverted, bufferConverted, textureFormat, textureWidth, textureHeight);
        __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "Created frame");
        __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "texture dimensions: %dx%d", textureWidth, textureHeight);
        initializedFrame = 1;
    }

    jint Java_com_nightscapecreations_anim3free_NativeCalls_getVideoWidth (JNIEnv * env, jobject this)  {
        return pCodecCtx->width;
    }

    jint Java_com_nightscapecreations_anim3free_NativeCalls_getVideoHeight (JNIEnv * env, jobject this)  {
        return pCodecCtx->height;
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_getFrame (JNIEnv * env, jobject this)  {
        // keep reading packets until we hit the end or find a video packet
        while(av_read_frame(pFormatCtx, &packet)>=0) {
            static struct SwsContext *img_convert_ctx;
            // Is this a packet from the video stream?
            if(packet.stream_index==videoStream) {
                // Decode video frame
                /* __android_log_print(ANDROID_LOG_DEBUG,  */
                /*            "video.c",  */
                /*            "getFrame: Try to decode frame" */
                /*            ); */
                avcodec_decode_video(pCodecCtx, pFrame, &frameFinished, packet.data, packet.size);
                // Did we get a video frame?
                if(frameFinished) {
                    if(img_convert_ctx == NULL) {
                        /* get/set the scaling context */
                        int w = pCodecCtx->width;
                        int h = pCodecCtx->height;
                        img_convert_ctx = sws_getContext(w, h, pCodecCtx->pix_fmt, textureWidth,textureHeight, textureFormat, SWS_FAST_BILINEAR, NULL, NULL, NULL);
                        if(img_convert_ctx == NULL) {
                            return;
                        }
                    }
                    /* if img convert null */
                    /* finally scale the image */
                    /* __android_log_print(ANDROID_LOG_DEBUG,  */
                    /*          "video.c",  */
                    /*          "getFrame: Try to scale the image" */
                    /*          ); */

                    //pFrameConverted = pFrame;
                    sws_scale(img_convert_ctx, pFrame->data, pFrame->linesize, 0, pCodecCtx->height, pFrameConverted->data, pFrameConverted->linesize);
                    //av_picture_crop(pFrameConverted->data, pFrame->data, 1, pCodecCtx->height, pCodecCtx->width);
                    //av_picture_crop();
                    //avfilter_vf_crop();

                    /* do something with pFrameConverted */
                    /* ... see drawFrame() */
                    /* We found a video frame, did something with it, now free up
                       packet and return */
                    av_free_packet(&packet);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.age: %d", pFrame->age);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.buffer_hints: %d", pFrame->buffer_hints);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.display_picture_number: %d", pFrame->display_picture_number);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.hwaccel_picture_private: %d", pFrame->hwaccel_picture_private);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.key_frame: %d", pFrame->key_frame);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.palette_has_changed: %d", pFrame->palette_has_changed);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.pict_type: %d", pFrame->pict_type);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.qscale_type: %d", pFrame->qscale_type);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.age: %d", pFrameConverted->age);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.buffer_hints: %d", pFrameConverted->buffer_hints);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.display_picture_number: %d", pFrameConverted->display_picture_number);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.hwaccel_picture_private: %d", pFrameConverted->hwaccel_picture_private);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.key_frame: %d", pFrameConverted->key_frame);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.palette_has_changed: %d", pFrameConverted->palette_has_changed);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.pict_type: %d", pFrameConverted->pict_type);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.qscale_type: %d", pFrameConverted->qscale_type);
                    return;
                } /* if frame finished */
            } /* if packet video stream */
            // Free the packet that was allocated by av_read_frame
            av_free_packet(&packet);
        } /* while */
        //reload video when you get to the end
        av_seek_frame(pFormatCtx,videoStream,0,AVSEEK_FLAG_ANY);
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_setLoopVideo (JNIEnv * env, jobject this, jboolean b) {
        loopVideo = b;
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_closeVideo (JNIEnv * env, jobject this) {
        if ( initializedFrame == 1 ) {
            // Free the converted image
            av_free(bufferConverted);
            av_free(pFrameConverted);
            initializedFrame = 0;
            __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed converted image");
        }
        if ( initializedVideo == 1 ) {
            /* // Free the YUV frame */
            av_free(pFrame);
            /* // Close the codec */
            avcodec_close(pCodecCtx);
            // Close the video file
            av_close_input_file(pFormatCtx);
            initializedVideo = 0;
            __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed video structures");
        }
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_freeVideo (JNIEnv * env, jobject this) {
        if ( initializedVideo == 1 ) {
            /* // Free the YUV frame */
            av_free(pFrame);
            /* // Close the codec */
            avcodec_close(pCodecCtx);
            // Close the video file
            av_close_input_file(pFormatCtx);
            __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed video structures");
            initializedVideo = 0;
        }
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_freeConversionStorage (JNIEnv * env, jobject this) {
        if ( initializedFrame == 1 ) {
            // Free the converted image
            av_free(bufferConverted);
            av_freep(pFrameConverted);
            initializedFrame = 0;
        }
    }

    /*--- END OF VIDEO ----*/

    /* disable these capabilities. */
    static GLuint s_disable_options[] = {
        GL_FOG,
        GL_LIGHTING,
        GL_CULL_FACE,
        GL_ALPHA_TEST,
        GL_BLEND,
        GL_COLOR_LOGIC_OP,
        GL_DITHER,
        GL_STENCIL_TEST,
        GL_DEPTH_TEST,
        GL_COLOR_MATERIAL,
        0
    };

    // For stuff that opengl needs to work with,
    // like the bitmap containing the texture
    void Java_com_nightscapecreations_anim3free_NativeCalls_initPreOpenGL (JNIEnv * env, jobject this)  {

    }
    ...

Solution

If you only want to reuse your former libs and not compile anything with the NDK, you can simply drop all your .so files inside jniLibs/<abi>.
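
With this project's libraries, that layout would look like the following (a sketch assuming armeabi is the only ABI being shipped; the file names come from the Android.mk and System.loadLibrary calls above, including libvideo.so, the module built from video.c, if you are not rebuilding it):

    app/src/main/jniLibs/armeabi/libavcore.so
    app/src/main/jniLibs/armeabi/libavformat.so
    app/src/main/jniLibs/armeabi/libavcodec.so
    app/src/main/jniLibs/armeabi/libavfilter.so
    app/src/main/jniLibs/armeabi/libavutil.so
    app/src/main/jniLibs/armeabi/libswscale.so
    app/src/main/jniLibs/armeabi/libvideo.so

One caveat: the logcat output in the question shows libavcore.so resolving its dependencies through an absolute path under the old package (/data/data/com.nightscapecreations.anim1free/lib), so prebuilts that were built with an app-specific prefix like these may need to be rebuilt for the new package name before either option will load.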

Otherwise, since your NDK build depends on prebuilts, it can't be configured properly through Gradle's ndk {} block. And as Gradle's built-in NDK support is deprecated for now, the cleanest way to make it work is to have Gradle call ndk-build and reuse your existing Makefiles:

    import org.apache.tools.ant.taskdefs.condition.Os

    ...

    android {
        ...
        sourceSets.main {
            jniLibs.srcDir 'src/main/libs' // set .so files location to libs instead of jniLibs
            jni.srcDirs = [] // disable automatic ndk-build call
        }

        // add a task that calls the regular ndk-build(.cmd) script from the app directory
        task ndkBuild(type: Exec) {
            if (Os.isFamily(Os.FAMILY_WINDOWS)) {
                commandLine 'ndk-build.cmd', '-C', file('src/main').absolutePath
            } else {
                commandLine 'ndk-build', '-C', file('src/main').absolutePath
            }
        }

        // add this task as a dependency of Java compilation
        tasks.withType(JavaCompile) {
            compileTask -> compileTask.dependsOn ndkBuild
        }
    }
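
As a possible addition (a sketch, not part of the original answer), the native objects can also be cleaned alongside the Java build by hooking another Exec task into Gradle's standard clean task; it reuses the same Os import as above:

    // optional: run "ndk-build clean" whenever gradle clean runs
    task ndkClean(type: Exec) {
        if (Os.isFamily(Os.FAMILY_WINDOWS)) {
            commandLine 'ndk-build.cmd', 'clean', '-C', file('src/main').absolutePath
        } else {
            commandLine 'ndk-build', 'clean', '-C', file('src/main').absolutePath
        }
    }
    clean.dependsOn ndkClean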
