Native window queueBuffer function not rendering output from Stagefright decoder


Problem Description


I'm passing a SurfaceView surface from Java to JNI, where I obtain the native window from that surface. Stagefright decodes H.264 frames from an MP4 file. During the decoding process I call ANativeWindow::queueBuffer() to send decoded frames to be rendered. There are no errors during decoding or when calling queueBuffer(), yet all I get is a black screen.

I strongly suspect I'm not setting up the native window properly, such that the frame actually reaches the screen when queueBuffer() is called. That said, I can render pixels to the native window directly via memcpy. Unfortunately, once I instantiate the OMXClient, a segfault occurs when trying to draw pixels manually, so it seems I must use queueBuffer().
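For reference, the direct path that does work (until OMXClient comes into play) looks roughly like this. It's a minimal sketch assuming an RGBA_8888 window and a source frame of matching size; draw_frame and its pixel source are illustrative names, not my exact code:

#include <stdint.h>
#include <string.h>
#include <android/native_window.h>

// Copy one RGBA frame into the window by hand. The stride reported by
// the window is in pixels, not bytes, hence the * 4.
void draw_frame(ANativeWindow* window, const uint8_t* pixels, int width, int height) {
    ANativeWindow_setBuffersGeometry(window, width, height, WINDOW_FORMAT_RGBA_8888);
    ANativeWindow_Buffer buffer;
    if (ANativeWindow_lock(window, &buffer, NULL) == 0) {
        uint8_t* dst = (uint8_t*) buffer.bits;
        for (int y = 0; y < height; y++) {
            memcpy(dst + y * buffer.stride * 4, pixels + y * width * 4, width * 4);
        }
        ANativeWindow_unlockAndPost(window);
    }
}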

My SurfaceView is set up in onCreate():

protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);

    SurfaceView surfaceView = new SurfaceView(this);
    surfaceView.getHolder().addCallback(this);
    setContentView(surfaceView);
}    

Once the surface is created, I call my native init() function with the surface:

@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    NativeLib.init(holder.getSurface(), width, height);
}

The native window is created in JNI and a decode thread is started:

nativeWindow = ANativeWindow_fromSurface(env, surface);
int ret = pthread_create(&decode_thread, NULL, &decode_frames, NULL);
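
Fleshed out, the native init() entry point is roughly this (a sketch; the Java_com_example_NativeLib naming and the globals are assumptions for illustration, and width/height are unused here):

#include <jni.h>
#include <pthread.h>
#include <android/native_window_jni.h>

static ANativeWindow* nativeWindow = NULL;
static pthread_t decode_thread;

void* decode_frames(void*);  // the routine shown below

extern "C" JNIEXPORT void JNICALL
Java_com_example_NativeLib_init(JNIEnv* env, jclass clazz, jobject surface, jint width, jint height) {
    // ANativeWindow_fromSurface acquires a reference on the window;
    // pair it with ANativeWindow_release() during teardown.
    nativeWindow = ANativeWindow_fromSurface(env, surface);
    pthread_create(&decode_thread, NULL, &decode_frames, NULL);
}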

My routine for decoding frames, modeled on vec.io's Stagefright decoding example:

void* decode_frames(void*){
    mNativeWindow = nativeWindow;
    sp<MediaSource> mVideoSource = new AVFormatSource();
    OMXClient mClient;
    mClient.connect();

    // Passing mNativeWindow here tells OMXCodec to allocate its output
    // buffers from the native window.
    sp<MediaSource> mVideoDecoder = OMXCodec::Create(mClient.interface(), mVideoSource->getFormat(), false, mVideoSource, NULL, 0, mNativeWindow);
    mVideoDecoder->start();

    status_t err = OK;
    while (err != ERROR_END_OF_STREAM) {
        MediaBuffer *mVideoBuffer;
        MediaSource::ReadOptions options;
        err = mVideoDecoder->read(&mVideoBuffer, &options);

        if (err == OK) {
            if (mVideoBuffer->range_length() > 0) {

                sp<MetaData> metaData = mVideoBuffer->meta_data();
                int64_t timeUs = 0;
                metaData->findInt64(kKeyTime, &timeUs);
                status_t err1 = native_window_set_buffers_timestamp(mNativeWindow.get(), timeUs * 1000);
                //This line results in a black frame
                status_t err2 = mNativeWindow->queueBuffer(mNativeWindow.get(), mVideoBuffer->graphicBuffer().get(), -1);

                if (err2 == 0) {
                    metaData->setInt32(kKeyRendered, 1);
                }
            }
            mVideoBuffer->release();
        }
    }

    mVideoSource.clear();
    mVideoDecoder->stop();
    mVideoDecoder.clear();
    mClient.disconnect();
    return NULL;
}

EDIT: Taking Ganesh's advice, I interfaced with the Awesome Renderer in order to change the color space. In doing so it became apparent that the color format wasn't being set in Stagefright:

08-06 00:56:32.842: A/SoftwareRenderer(7326): frameworks/av/media/libstagefright/colorconversion/SoftwareRenderer.cpp:42 CHECK(meta->findInt32(kKeyColorFormat, &tmp)) failed.
08-06 00:56:32.842: A/libc(7326): Fatal signal 11 (SIGSEGV) at 0xdeadbaad (code=1), thread 7340 (hieu.alloclient)

Trying to set the color space explicitly (kKeyColorFormat to a YUV420P color space) leads to a dequeue problem, which probably makes sense because the color format I specified was arbitrary:

08-06 00:44:30.878: V/OMXCodec(6937): matchComponentName (null)
08-06 00:44:30.888: V/OMXCodec(6937): matching 'OMX.qcom.video.decoder.avc' quirks 0x000000a8
08-06 00:44:30.888: V/OMXCodec(6937): matchComponentName (null) 
08-06 00:44:30.888: V/OMXCodec(6937): matching 'OMX.google.h264.decoder' quirks 0x00000000
08-06 00:44:30.888: V/OMXCodec(6937): Attempting to allocate OMX node 'OMX.qcom.video.decoder.avc'
08-06 00:44:30.918: V/OMXCodec(6937): Successfully allocated OMX node 'OMX.qcom.video.decoder.avc'
08-06 00:44:30.918: V/OMXCodec(6937): configureCodec protected=0
08-06 00:44:30.918: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] AVC profile = 66 (Baseline), level = 13
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] setVideoOutputFormat width=320, height=240
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] portIndex: 0, index: 0, eCompressionFormat=7 eColorFormat=0
08-06 00:44:30.918: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] found a match.
08-06 00:44:30.938: I/QCOMXCodec(6937): Decoder should be in arbitrary mode
08-06 00:44:30.958: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] video dimensions are 320 x 240
08-06 00:44:30.958: I/OMXCodec(6937): [OMX.qcom.video.decoder.avc] Crop rect is 320 x 240 @ (0, 0)
08-06 00:44:30.958: D/infoJNI(6937): before started
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocating 2 buffers of size 2097088 on input port
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocated buffer 0x417037d8 on input port
08-06 00:44:30.968: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocated buffer 0x41703828 on input port
08-06 00:44:30.978: V/OMXCodec(6937): native_window_set_usage usage=0x40000000
08-06 00:44:30.978: V/OMXCodec(6937): [OMX.qcom.video.decoder.avc] allocating 22 buffers from a native window of size 147456 on output port
08-06 00:44:30.978: E/OMXCodec(6937): dequeueBuffer failed: Invalid argument (22)
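
For concreteness, the explicit attempt amounted to roughly this (a sketch, not a fix; OMX_COLOR_FormatYUV420Planar is the arbitrary choice mentioned above, which is presumably why dequeueBuffer then rejects the buffers):

#include <media/stagefright/MetaData.h>
#include <OMX_IVCommon.h>

// Force a color format onto the decoder's output metadata so the
// SoftwareRenderer CHECK passes. The format is a guess, not what the
// OMX component actually outputs.
sp<MetaData> format = mVideoDecoder->getFormat();
format->setInt32(kKeyColorFormat, OMX_COLOR_FormatYUV420Planar);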

Solution

I ended up solving this issue by using the Java low-level APIs instead. I set up a native read_frame function that parses video frames using FFmpeg (see the sketch below). I call this function from a separate Java decoder thread, which returns a new frame of data to be decoded by MediaCodec. Rendering this way was very straightforward: just pass MediaCodec the surface.

Alternatively, I could have used MediaExtractor, but FFmpeg had some other functionality that I needed.
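
For completeness, the native read_frame side might look roughly like this, using FFmpeg's libavformat demuxing API from that era. The JNI class name and the byte-array handoff are illustrative assumptions; on the Java side, the decoder thread copies each returned packet into a MediaCodec input buffer and renders by releasing output buffers with render=true.

extern "C" {
#include <libavformat/avformat.h>
}
#include <jni.h>

static AVFormatContext* fmtCtx = NULL;
static int videoStreamIndex = -1;

// Open the container once and locate the video stream.
extern "C" JNIEXPORT jboolean JNICALL
Java_com_example_NativeLib_openFile(JNIEnv* env, jclass clazz, jstring jpath) {
    const char* path = env->GetStringUTFChars(jpath, NULL);
    av_register_all();  // required on FFmpeg versions of that era
    int ok = avformat_open_input(&fmtCtx, path, NULL, NULL) == 0
          && avformat_find_stream_info(fmtCtx, NULL) >= 0;
    env->ReleaseStringUTFChars(jpath, path);
    if (!ok) return JNI_FALSE;
    for (unsigned i = 0; i < fmtCtx->nb_streams; i++) {
        if (fmtCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
            videoStreamIndex = i;
            break;
        }
    }
    return videoStreamIndex >= 0 ? JNI_TRUE : JNI_FALSE;
}

// Copy the next video packet into the caller's byte[] (assumed large
// enough) and return its size, or -1 at end of stream. The Java decoder
// thread hands this buffer to MediaCodec.queueInputBuffer().
extern "C" JNIEXPORT jint JNICALL
Java_com_example_NativeLib_readFrame(JNIEnv* env, jclass clazz, jbyteArray out) {
    AVPacket pkt;
    while (av_read_frame(fmtCtx, &pkt) >= 0) {
        if (pkt.stream_index == videoStreamIndex) {
            jint size = pkt.size;
            env->SetByteArrayRegion(out, 0, size, (const jbyte*) pkt.data);
            av_free_packet(&pkt);  // av_packet_unref() on modern FFmpeg
            return size;
        }
        av_free_packet(&pkt);
    }
    return -1;
}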
