Regarding Android's Mediacodec speed concerns and bottlenecks


Question

Currently, encoding a 30 FPS video consisting of 90 frames with MediaCodec takes me 2100-2400 ms. I'm using the code found here, except with the generateSurfaceFrame(i) part replaced with:

private void generateFrame(Bitmap bitmap, Rect source)
{
    long drawFrameStartTime = System.currentTimeMillis();
    Canvas canvas = mInputSurface.lockCanvas(null);
    // Clear whatever the previous frame left on the encoder's input surface
    canvas.drawRect(0, 0, mSquareDim, mSquareDim, clearPaint);
    //Process the canvas below
    try
    {
        // Scale the source region of the bitmap onto the full mSquareDim x mSquareDim surface
        canvas.drawBitmap(bitmap, source, new Rect(0, 0, mSquareDim, mSquareDim), antiAliasPaint);
    }
    //Process the canvas above
    catch (Exception e) {Log.e("renderExc", e.toString());}
    finally {mInputSurface.unlockCanvasAndPost(canvas);}
    long drawFrameEndTime = System.currentTimeMillis();
    Log.i("frame_draw_time", (drawFrameEndTime - drawFrameStartTime) + "");
}

And with the part that puts the frames into the MediaMuxer replaced by code found and adapted from here, the one using the CircularBuffer class from Grafika. With that code, the muxer had to be released independently from the rest.
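
For orientation, here is a minimal sketch of that kind of drain loop using only the standard MediaCodec/MediaMuxer calls (not the actual code from the link, nor Grafika's CircularBuffer); mEncoder, mMuxer, mTrackIndex and mMuxerStarted are assumed fields:

// Hypothetical drain helper: pulls encoded output and feeds the muxer.
// Uses the pre-API-21 buffer API, matching the API 18 minimum.
private void drainEncoder(boolean endOfStream)
{
    final long TIMEOUT_US = 10000;
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    if (endOfStream) {
        mEncoder.signalEndOfInputStream();   // Surface input: EOS is signalled this way
    }
    while (true) {
        int index = mEncoder.dequeueOutputBuffer(info, TIMEOUT_US);
        if (index == MediaCodec.INFO_TRY_AGAIN_LATER) {
            if (!endOfStream) break;         // no output yet; go back to feeding frames
        } else if (index == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // The muxer can only be started once the encoder reports its real format
            mTrackIndex = mMuxer.addTrack(mEncoder.getOutputFormat());
            mMuxer.start();
            mMuxerStarted = true;
        } else if (index >= 0) {
            ByteBuffer data = mEncoder.getOutputBuffers()[index];
            if ((info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                info.size = 0;               // CSD was already delivered via the format
            }
            if (info.size > 0 && mMuxerStarted) {
                data.position(info.offset);
                data.limit(info.offset + info.size);
                mMuxer.writeSampleData(mTrackIndex, data, info);
            }
            mEncoder.releaseOutputBuffer(index, false);
            if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
        }
    }
}

// Teardown: the encoder and the muxer are released independently of each other.
private void releaseEverything()
{
    if (mEncoder != null) { mEncoder.stop(); mEncoder.release(); }
    if (mMuxer != null)   { mMuxer.stop();  mMuxer.release();   }
}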

I'm still concerned about MediaCodec's other bottlenecks when it comes to speed, though, and I'm targeting API 18 (minimum) at the moment. My questions are:

  1. Should I start using asynchronous mode, and how much faster is it than synchronous mode? (See the sketch after this list for what async mode looks like.)
  2. Is drawing the frames with OpenGL faster than the Surface-Canvas method above?
  3. Should I be concerned about any other bottlenecks in MediaCodec?
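
For reference, asynchronous mode here means registering a MediaCodec.Callback instead of polling the dequeue calls. A minimal sketch only; note that setCallback() requires API 21, so it is not usable at the API 18 minimum above, and mEncoder, mMuxer, mTrackIndex, mMuxerStarted and mFormat are placeholders for fields set up elsewhere:

// Requires API 21+; MediaCodec.setCallback() does not exist on API 18.
mEncoder.setCallback(new MediaCodec.Callback() {
    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        // Not called when the input comes from createInputSurface()
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        ByteBuffer data = codec.getOutputBuffer(index);
        boolean isConfig = (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
        if (data != null && info.size > 0 && !isConfig && mMuxerStarted) {
            mMuxer.writeSampleData(mTrackIndex, data, info);
        }
        codec.releaseOutputBuffer(index, false);
    }

    @Override
    public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) {
        // The muxer can only be started once the real output format is known
        mTrackIndex = mMuxer.addTrack(format);
        mMuxer.start();
        mMuxerStarted = true;
    }

    @Override
    public void onError(MediaCodec codec, MediaCodec.CodecException e) {
        Log.e("encoder", "codec error", e);
    }
});
// setCallback() has to be called before configure()/start()
mEncoder.configure(mFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);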

Full source code will be provided when asked.

Answer

@mstorsjo hit the high points. You can find an example of GLES-based video generation in Grafika, e.g. MovieEightRects uses the GeneratedMovie helper class.
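
For illustration only, a stripped-down sketch of the idea those Grafika classes implement (this is not their actual code): wrap the encoder's input Surface in an EGL window surface, render each frame with GLES, stamp the presentation time, and swap buffers to submit the frame. Field names and the hand-rolled EGL_RECORDABLE_ANDROID constant are placeholders, and error checking is omitted:

// EGL14 has no symbol for this attribute at API 18, so it is defined by hand.
private static final int EGL_RECORDABLE_ANDROID = 0x3142;

private EGLDisplay mDisplay;
private EGLContext mContext;
private EGLSurface mWindowSurface;

// encoderSurface is the Surface returned by MediaCodec.createInputSurface().
private void setUpEgl(Surface encoderSurface)
{
    mDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
    int[] version = new int[2];
    EGL14.eglInitialize(mDisplay, version, 0, version, 1);

    int[] configAttribs = {
            EGL14.EGL_RED_SIZE, 8, EGL14.EGL_GREEN_SIZE, 8, EGL14.EGL_BLUE_SIZE, 8,
            EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
            EGL_RECORDABLE_ANDROID, 1,
            EGL14.EGL_NONE
    };
    EGLConfig[] configs = new EGLConfig[1];
    int[] numConfigs = new int[1];
    EGL14.eglChooseConfig(mDisplay, configAttribs, 0, configs, 0, 1, numConfigs, 0);

    int[] contextAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
    mContext = EGL14.eglCreateContext(mDisplay, configs[0], EGL14.EGL_NO_CONTEXT, contextAttribs, 0);

    int[] surfaceAttribs = { EGL14.EGL_NONE };
    mWindowSurface = EGL14.eglCreateWindowSurface(mDisplay, configs[0], encoderSurface, surfaceAttribs, 0);
    EGL14.eglMakeCurrent(mDisplay, mWindowSurface, mWindowSurface, mContext);
}

private void renderFrame(long ptsNanos)
{
    // Replace this clear with real GLES drawing, e.g. a textured quad for the bitmap
    GLES20.glClearColor(0f, 0f, 0f, 1f);
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

    // The encoder picks this timestamp up when the frame is submitted
    EGLExt.eglPresentationTimeANDROID(mDisplay, mWindowSurface, ptsNanos);
    EGL14.eglSwapBuffers(mDisplay, mWindowSurface);  // submits the frame to the codec
}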

If you measure the end-to-end video encoding time you will be measuring both throughput and latency. MediaCodec talks to a separate process (mediaserver) through IPC, which has to allocate hardware resources through an OMX driver. It takes a little time for this to warm up, and there's some amount of latency shoving frames through the codec.
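
If you want to separate those two effects, one crude approach is to time the codec setup apart from the per-frame loop. A sketch under the assumption that generateFrame() is the Canvas method from the question and drainEncoder() is a helper like the one sketched earlier:

// Rough separation of one-time warm-up cost from steady-state per-frame cost.
private void encodeAndTime(MediaFormat format, Bitmap bitmap, Rect source) throws IOException
{
    long setupStart = System.nanoTime();
    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    mInputSurface = encoder.createInputSurface();
    encoder.start();
    long setupMs = (System.nanoTime() - setupStart) / 1000000;
    Log.i("encode_timing", "codec setup (warm-up): " + setupMs + " ms");

    long loopStart = System.nanoTime();
    for (int i = 0; i < 90; i++) {          // 90 frames, as in the question
        generateFrame(bitmap, source);      // the Canvas method above
        drainEncoder(false);                // hypothetical drain helper
    }
    drainEncoder(true);
    long avgMs = (System.nanoTime() - loopStart) / 1000000 / 90;
    Log.i("encode_timing", "average time per frame: " + avgMs + " ms");
}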

Generating frames faster won't affect the overall encoding speed so long as you're generating as fast as the encoder can encode. The occasional stall when sending data to MediaMuxer will plug up the pipeline, hence the Horizon Camera blog post, so it's reasonable to worry about that (especially if your source drops frames when the encoding pipeline stalls).
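
Not the Grafika CircularBuffer itself, but a simplified sketch of the same idea using a bounded queue: the drain loop copies each encoded chunk and hands it to a dedicated writer thread, so a slow MediaMuxer write no longer blocks the codec. mMuxer and mTrackIndex are assumed fields:

// Simplified stand-in for the circular-buffer approach.
private static class EncodedChunk
{
    final ByteBuffer data;
    final MediaCodec.BufferInfo info;
    EncodedChunk(ByteBuffer data, MediaCodec.BufferInfo info) { this.data = data; this.info = info; }
}

private final BlockingQueue<EncodedChunk> mWriteQueue = new ArrayBlockingQueue<EncodedChunk>(64);

// Called from the drain loop instead of writing to the muxer directly.
// Config buffers are expected to have been filtered out already.
private void enqueueChunk(ByteBuffer encoded, MediaCodec.BufferInfo info)
{
    ByteBuffer copy = ByteBuffer.allocateDirect(info.size);
    encoded.position(info.offset);
    encoded.limit(info.offset + info.size);
    copy.put(encoded);
    copy.flip();
    MediaCodec.BufferInfo infoCopy = new MediaCodec.BufferInfo();
    infoCopy.set(0, info.size, info.presentationTimeUs, info.flags);
    if (!mWriteQueue.offer(new EncodedChunk(copy, infoCopy))) {
        Log.w("muxer_writer", "write queue full, dropping chunk");  // or block, or grow the queue
    }
}

// Runs on its own thread; the muxer is the only thing this thread touches.
private void muxerWriterLoop() throws InterruptedException
{
    while (true) {
        EncodedChunk chunk = mWriteQueue.take();
        if (chunk.info.size > 0) {
            mMuxer.writeSampleData(mTrackIndex, chunk.data, chunk.info);
        }
        if ((chunk.info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
    }
    mMuxer.stop();
    mMuxer.release();   // released here, independently of the encoder teardown
}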
