MediaCodec.dequeueOutputBuffer taking very long when encoding h264 on Android

Problem description

I'm trying to encode h264 video on Android for real-time video streaming using MediaCodec but dequeueOutputBuffer keeps taking very long (actually it's very fast sometimes but very slow at other times, see log output below). I've seen it go even up to 200ms for the output buffer to be ready. Is there something I'm doing wrong with my code or do you think this is an issue with the OMX.Nvidia.h264.encoder?

Maybe I need to downsample the image from 1280x720 to something smaller? Or maybe I need to dequeue and queue more input buffers while I'm waiting for the output buffer? (There are 6 input and 6 output buffers available). I'm using Android API 19, so I can't use the asynchronous MediaCodec processing method. I'm actually streaming an image from a Google Project Tango tablet, so my other suspicion is that perhaps the Tango's background operations are taking too long and causing the encoder to be slow. Any thoughts on what might be slowing this down so much?

01-20 23:36:30.728 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 0.400666ms.
01-20 23:36:30.855 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 94.290667ms.
01-20 23:36:30.880 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 0.57ms.
01-20 23:36:30.929 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 4.878417ms.
01-20 23:36:31.042 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 77.495417ms.
01-20 23:36:31.064 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 0.3225ms.
01-20 23:36:31.182 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 74.777583ms.
01-20 23:36:31.195 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 0.23ms.
01-20 23:36:31.246 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 17.243583ms.
01-20 23:36:31.350 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 80.14725ms.
01-20 23:36:31.373 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 2.493834ms.
01-20 23:36:31.421 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 13.273ms.
01-20 23:36:31.546 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 93.543667ms.
01-20 23:36:31.576 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 5.309334ms.
01-20 23:36:31.619 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 13.402583ms.
01-20 23:36:31.686 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 22.5485ms.
01-20 23:36:31.809 2920-3014/com.... D/StreamingThread: dequeueOutputBuffer took 91.392083ms.

My relevant code is as follows:

public class StreamingThread extends Thread {
    ...

    // encoding
    private MediaCodec mVideoEncoder = null;
    private ByteBuffer[] mEncoderInputBuffers = null;
    private ByteBuffer[] mEncoderOutputBuffers = null;
    private NV21Convertor mNV21Converter = null;

    public static native VideoFrame getNewFrame();

    public StreamingThread()
    {
        this.setPriority(MAX_PRIORITY);
    }

    @Override
    public void run()
    {
        Looper.prepare();
        init();
        Looper.loop();
    }

    private void init()
    {
        mHandler = new Handler() {
            public void handleMessage(Message msg) {
                // process incoming messages here
                switch(msg.what)
                {
                    case HAVE_NEW_FRAME: // new frame has arrived (signaled from main thread)
                        processBufferedFrames();
                        break;

                    case CLOSE_THREAD:
                        close();
                        break;

                    default:
                        Log.e(LOGTAG, "received unknown message!");
                }
            }
        };

        try {
            ...
            // set up video encoding
            final String mime = "video/avc"; // H.264/AVC
            listAvailableEncoders(mime); // (this creates some debug output only)
            String codec = "OMX.Nvidia.h264.encoder"; // instead, hard-code the codec we want to use for now

            mVideoEncoder = MediaCodec.createByCodecName(codec);
            if(mVideoEncoder == null)
                Log.e(LOGTAG, "Media codec " + codec + " is not available!");

            // TODO: change, based on what we're streaming...
            int FRAME_WIDTH = 1280;
            int FRAME_HEIGHT = 720;

            // https://github.com/fyhertz/libstreaming/blob/ac44416d88ed3112869ef0f7eab151a184bbb78d/src/net/majorkernelpanic/streaming/hw/EncoderDebugger.java
            mNV21Converter = new NV21Convertor();
            mNV21Converter.setSize(FRAME_WIDTH, FRAME_HEIGHT);
            mNV21Converter.setEncoderColorFormat(MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
            mNV21Converter.setColorPanesReversed(true);
            mNV21Converter.setYPadding(0);

            MediaFormat format = MediaFormat.createVideoFormat(mime, FRAME_WIDTH, FRAME_HEIGHT);
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
            // TODO: optimize bit rate
            format.setInteger(MediaFormat.KEY_BIT_RATE, 250000); // 250,000 bits/second = ~0.03 megabytes/s

            mVideoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            mVideoEncoder.start();
            mEncoderInputBuffers  = mVideoEncoder.getInputBuffers();
            mEncoderOutputBuffers = mVideoEncoder.getOutputBuffers();

            Log.d(LOGTAG, "Number of input buffers " + mEncoderInputBuffers.length);
            Log.d(LOGTAG, "Number of output buffers " + mEncoderOutputBuffers.length);

            initialized = true;

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void close()
    {
        Looper.myLooper().quit();
        mVideoEncoder.stop();
        mVideoEncoder.release();
        mVideoEncoder = null;
    }

    private void processBufferedFrames()
    {
        if (!initialized)
            return;
        VideoFrame frame = getNewFrame();

        try {
            sendTCPFrame(frame);

        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    private void sendTCPFrame(VideoFrame frame)
    {
        long start = System.nanoTime();

        long start2 = System.nanoTime();
        int inputBufferIndex = -1;
        while((inputBufferIndex = mVideoEncoder.dequeueInputBuffer(-1)) < 0 ) { // -1: wait indefinitely for the buffer
            switch(inputBufferIndex) {
                default:
                    Log.e(LOGTAG, "dequeueInputBuffer returned unknown value: " + inputBufferIndex);
            }
        }
        // fill in input (raw) data:
        mEncoderInputBuffers[inputBufferIndex].clear();

        long stop2 = System.nanoTime();
        Log.d(LOGTAG, "dequeueInputBuffer took " + (stop2 - start2) / 1e6 + "ms.");

        start2 = System.nanoTime();
        byte[] pixels = mNV21Converter.convert(frame.pixels);
        stop2 = System.nanoTime();
        Log.d(LOGTAG, "mNV21Converter.convert took " + (stop2-start2)/1e6 + "ms.");

        start2 = System.nanoTime();
        mEncoderInputBuffers[inputBufferIndex].put(pixels);
        stop2 = System.nanoTime();
        Log.d(LOGTAG, "mEncoderInputBuffers[inputBufferIndex].put(pixels) took " + (stop2 - start2) / 1e6 + "ms.");

        start2 = System.nanoTime();
        //mVideoEncoder.queueInputBuffer(inputBufferIndex, 0, pixels.length, 0, 0);
        //mVideoEncoder.queueInputBuffer(inputBufferIndex, 0, pixels.length, System.nanoTime() / 1000, 0);
        mVideoEncoder.queueInputBuffer(inputBufferIndex, 0, pixels.length, System.nanoTime(), 0); // note: the presentation time argument is in microseconds, so nanoTime() should really be divided by 1000
        stop2 = System.nanoTime();
        Log.d(LOGTAG, "queueInputBuffer took " + (stop2 - start2) / 1e6 + "ms.");

        start2 = System.nanoTime();
        // wait for encoded data to become available:
        int outputBufferIndex = -1;
        MediaCodec.BufferInfo bufInfo = new MediaCodec.BufferInfo();
        long timeoutUs = -1;//10000; // microseconds
        while((outputBufferIndex = mVideoEncoder.dequeueOutputBuffer(bufInfo, timeoutUs)) < 0 ) { // -1: wait indefinitely for the buffer
            Log.i(LOGTAG, "dequeueOutputBuffer returned value: " + outputBufferIndex);
            switch(outputBufferIndex) {
                case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                    // output buffers have changed, move reference
                    mEncoderOutputBuffers = mVideoEncoder.getOutputBuffers();
                    break;
                case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                    // Subsequent data will conform to new format.
                    //MediaFormat format = codec.getOutputFormat();
                    Log.e(LOGTAG, "dequeueOutputBuffer returned INFO_OUTPUT_FORMAT_CHANGED ?!");
                    break;
                case MediaCodec.INFO_TRY_AGAIN_LATER:
                    Log.w(LOGTAG, "dequeueOutputBuffer return INFO_TRY_AGAIN_LATER");
                    break;
                default:
                    Log.e(LOGTAG, "dequeueOutputBuffer returned unknown value: " + outputBufferIndex);
            }
        }
        stop2 = System.nanoTime();
        Log.d(LOGTAG, "dequeueOutputBuffer took " + (stop2 - start2) / 1e6 + "ms.");

        // output (encoded) data available!
        Log.d(LOGTAG, "encoded buffer info: size = " + bufInfo.size + ", offset = " + bufInfo.offset + ", presentationTimeUs = " + bufInfo.presentationTimeUs + ", flags = " + bufInfo.flags);
        ByteBuffer encodedData = mEncoderOutputBuffers[outputBufferIndex];
        final int sizeOfImageData = bufInfo.size;

        long stop = System.nanoTime();
        Log.d(LOGTAG, "Encoding image took " + (stop-start)/1e6 + "ms.");

        start = System.nanoTime();
        // assemble header:
    ... 

        encodedData.rewind();
        // copy (!) the encoded data into a separate direct ByteBuffer:
        ByteBuffer imageBuffer = ByteBuffer.allocateDirect(encodedData.remaining());
        imageBuffer.put(encodedData); // TODO: can this copy be avoided?

        stop = System.nanoTime();
        Log.d(LOGTAG, "Preparing content for streaming took " + (stop - start) / 1e6 + "ms.");
        // do streaming via TCP
        ...
        mVideoEncoder.releaseOutputBuffer(outputBufferIndex, false);
    }

    // see http://developer.android.com/reference/android/media/MediaCodecInfo.html
    private void listAvailableEncoders(String mimeType)
    {
        Log.d(LOGTAG, "Available encoders for mime type " + mimeType + ":");
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo codec = MediaCodecList.getCodecInfoAt(i);

            if (!codec.isEncoder())
                continue;

            String[] types = codec.getSupportedTypes();
            for (int j = 0; j < types.length; j++) {
                //if (types[j].equalsIgnoreCase(mimeType)) {
                String msg = "- name: " + codec.getName() + ", supported color formats for " + mimeType + ":";
                MediaCodecInfo.CodecCapabilities cap = codec.getCapabilitiesForType(mimeType);
                for(int k = 0; k < cap.colorFormats.length; ++k) msg = msg + " " + cap.colorFormats[k];
                Log.d(LOGTAG, msg);
                //  break;
                //}
            }
        }
    }

Solution

Yes, there is something wrong with your code - you are waiting synchronously for the current frame to come out of the encoder before proceeding with the next frame. Most hardware codecs have more latency than you would expect, and to get the throughput the encoder is capable of, you need to use it asynchronously.

That is, after submitting one input buffer for encoding, you should not wait for the encoded output buffer, but only check whether output is available. You should then go on and feed the next input buffer, and again check for any available output. Only once you can't get an input buffer immediately should you start waiting for output. That way there is always more than one input buffer queued for the encoder to work on, which keeps it busy enough to actually reach the frame rate it is capable of.
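
As a rough illustration, here is one way the submit/drain logic from sendTCPFrame() above could be restructured. This is only a sketch based on the fields in the question's code; drainEncoder() and sendEncodedDataOverTcp() are hypothetical helpers standing in for the header assembly and TCP write that were elided above.

private void sendTCPFrame(VideoFrame frame)
{
    // Blocking for an input slot is fine; with several input buffers this returns quickly.
    int inputBufferIndex = mVideoEncoder.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0) {
        byte[] pixels = mNV21Converter.convert(frame.pixels);
        mEncoderInputBuffers[inputBufferIndex].clear();
        mEncoderInputBuffers[inputBufferIndex].put(pixels);
        mVideoEncoder.queueInputBuffer(inputBufferIndex, 0, pixels.length,
                System.nanoTime() / 1000, 0); // presentation time in microseconds
    }
    // Grab whatever output is already finished, but never block waiting for it.
    drainEncoder();
}

// Drain all output buffers that are ready right now; returns immediately when none are.
private void drainEncoder()
{
    MediaCodec.BufferInfo bufInfo = new MediaCodec.BufferInfo();
    while (true) {
        int outputBufferIndex = mVideoEncoder.dequeueOutputBuffer(bufInfo, 0); // timeout 0 = don't wait
        if (outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
            break; // nothing encoded yet - it will be picked up when the next frame arrives
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
            mEncoderOutputBuffers = mVideoEncoder.getOutputBuffers();
        } else if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            Log.d(LOGTAG, "encoder output format: " + mVideoEncoder.getOutputFormat());
        } else if (outputBufferIndex >= 0) {
            ByteBuffer encodedData = mEncoderOutputBuffers[outputBufferIndex];
            sendEncodedDataOverTcp(encodedData, bufInfo); // hypothetical: header assembly + TCP write as before
            mVideoEncoder.releaseOutputBuffer(outputBufferIndex, false);
        }
    }
}

With this structure, dequeueOutputBuffer never stalls the input side; the 70-90 ms spikes you measured simply become the encoder's internal pipeline latency, hidden behind the work done for the following frames.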

(If you are OK with requiring Android 5.0, you could take a look at MediaCodec.setCallback, which makes it easier to work with the codec asynchronously.)
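
For completeness, the Android 5.0 (API 21) callback-based mode looks roughly like this. It is only a sketch reusing the mVideoEncoder and format from the question's init(); the frame-filling and streaming steps are indicated by comments rather than real code.

// In async mode the callback must be installed before configure()/start() (API 21+).
mVideoEncoder.setCallback(new MediaCodec.Callback() {
    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        ByteBuffer input = codec.getInputBuffer(index);
        // fill 'input' with the next converted NV21 frame, then queue it, e.g.:
        // codec.queueInputBuffer(index, 0, length, presentationTimeUs, 0);
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        ByteBuffer encoded = codec.getOutputBuffer(index);
        // stream 'encoded' over TCP here, then release the buffer:
        codec.releaseOutputBuffer(index, false);
    }

    @Override
    public void onOutputFormatChanged(MediaCodec codec, MediaFormat newFormat) {
        Log.d(LOGTAG, "encoder output format changed: " + newFormat);
    }

    @Override
    public void onError(MediaCodec codec, MediaCodec.CodecException e) {
        Log.e(LOGTAG, "encoder error", e);
    }
});
mVideoEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
mVideoEncoder.start();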

There are even some codecs (mainly decoders though, if I remember correctly) that won't output the first buffer at all until you have fed them more than a few input buffers.
