MediaCodec H264 Encoder not working on Snapdragon 800 devices

Question

I have written an H264 stream encoder using the MediaCodec API of Android. I tested it on about ten devices with different processors and it worked on all of them, except on Snapdragon 800 powered ones (Google Nexus 5 and Sony Xperia Z1). On those devices I get the SPS and PPS and the first keyframe, but after that mEncoder.dequeueOutputBuffer(mBufferInfo, 0) only returns MediaCodec.INFO_TRY_AGAIN_LATER. I have already experimented with different timeouts, bitrates, resolutions and other configuration options, to no avail. The result is always the same.

I use the following code to initialise the Encoder:

        mBufferInfo = new MediaCodec.BufferInfo();
        mEncoder = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 768000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, mEncoderColorFormat);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
        mEncoder.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

Where the color format is selected by:

MediaCodecInfo.CodecCapabilities capabilities = mCodecInfo.getCapabilitiesForType(MIME_TYPE);
for (int i = 0; i < capabilities.colorFormats.length && selectedColorFormat == 0; i++)
{
    int format = capabilities.colorFormats[i];
    switch (format) {
        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
        case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
        case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
        case MediaCodecInfo.CodecCapabilities.COLOR_QCOM_FormatYUV420SemiPlanar:
            selectedColorFormat = format;
            break;
        default:
            LogHandler.e(LOG_TAG, "Unsupported color format " + format);
            break;
    }
}

I get the data by doing:

        ByteBuffer[] inputBuffers = mEncoder.getInputBuffers();
        ByteBuffer[] outputBuffers = mEncoder.getOutputBuffers();

        int inputBufferIndex = mEncoder.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0)
        {
            // fill inputBuffers[inputBufferIndex] with valid data
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(rawFrame);
            mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length, 0, 0);
            LogHandler.e(LOG_TAG, "Queue Buffer in " + inputBufferIndex);
        }

        while(true)
        {
            int outputBufferIndex = mEncoder.dequeueOutputBuffer(mBufferInfo, 0);
            if (outputBufferIndex >= 0)
            {
                Log.d(LOG_TAG, "Queue Buffer out " + outputBufferIndex);
                ByteBuffer buffer = outputBuffers[outputBufferIndex];
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0)
                {
                    // Config Bytes means SPS and PPS
                    Log.d(LOG_TAG, "Got config bytes");
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0)
                {
                    // Marks a Keyframe
                    Log.d(LOG_TAG, "Got Sync Frame");
                }

                if (mBufferInfo.size != 0)
                {
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    buffer.position(mBufferInfo.offset);
                    buffer.limit(mBufferInfo.offset + mBufferInfo.size);

                    int nalUnitLength = 0;
                    while((nalUnitLength = parseNextNalUnit(buffer)) != 0)
                    {
                        switch(mVideoData[0] & 0x0f)
                        {
                            // SPS
                            case 0x07:
                            {
                                Log.d(LOG_TAG, "Got SPS");
                                break;
                            }

                            // PPS
                            case 0x08:
                            {
                                Log.d(LOG_TAG, "Got PPS");
                                break;
                            }

                            // Key Frame
                            case 0x05:
                            {
                                Log.d(LOG_TAG, "Got Keyframe");
                            }

                            //$FALL-THROUGH$
                            default:
                            {
                                // Process Data
                                break;
                            }
                        }
                    }
                }

                mEncoder.releaseOutputBuffer(outputBufferIndex, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0)
                {
                    // Stream is marked as done,
                    // break out of while
                    Log.d(LOG_TAG, "Marked EOS");
                    break;
                }
            }
            else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
            {
                outputBuffers = mEncoder.getOutputBuffers();
                Log.d(LOG_TAG, "Output Buffer changed " + outputBuffers);
            }
            else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
            {
                MediaFormat newFormat = mEncoder.getOutputFormat();
                Log.d(LOG_TAG, "Media Format Changed " + newFormat);
            }
            else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER)
            {
                // No Data, break out
                break;
            }
            else
            {
                // Unexpected State, ignore it
                Log.d(LOG_TAG, "Unexpected State " + outputBufferIndex);
            }
        }

Thanks for your help!

Answer

You need to set the presentationTimeUs parameter in your call to queueInputBuffer. Most encoders ignore this and you can encode for streaming without issues. The encoder used for Snapdragon 800 devices doesn't.

This parameter represents the recording time of your frame and therefore needs to increase by the number of microseconds between the frame you want to encode and the previous one.

If the parameter is set to the same value as in the previous frame, the encoder drops the frame. If it is set to a value that is too small (e.g. 100000 on a 30 FPS recording), the quality of the encoded frames drops.
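A minimal sketch of one way to supply the timestamp, assuming a fixed frame rate and a frame counter (the FRAME_RATE constant, mFrameIndex field and computePresentationTimeUs() helper are illustrative names, not part of the original code):

    private static final int FRAME_RATE = 30;   // must match MediaFormat.KEY_FRAME_RATE
    private long mFrameIndex = 0;

    // Returns a strictly increasing timestamp in microseconds,
    // advancing by one frame duration per call.
    private long computePresentationTimeUs()
    {
        return mFrameIndex++ * 1000000L / FRAME_RATE;
    }

    ...

    int inputBufferIndex = mEncoder.dequeueInputBuffer(-1);
    if (inputBufferIndex >= 0)
    {
        ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
        inputBuffer.clear();
        inputBuffer.put(rawFrame);
        // Pass the increasing timestamp instead of 0 so the encoder
        // does not treat the frame as a duplicate and drop it.
        mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length,
                computePresentationTimeUs(), 0);
    }

The actual capture time of each frame (for example System.nanoTime() / 1000) should work as well, as long as the value strictly increases from one queued frame to the next.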
