Video rendering is broken: MediaCodec H.264 stream

Problem description

I am implementing a decoder with the MediaCodec Java API to decode a live remote H.264 stream. I receive the H.264 encoded data from the native layer through a callback (void OnRecvEncodedData(byte[] encodedData)), then decode it and render it onto the Surface of a TextureView. My implementation is complete (retrieving the encoded stream via the callback, decoding, rendering, etc.). Here is my decoder class:

import java.io.IOException;
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.os.Build;
import android.util.Log;
import android.view.Surface;

public class MediaCodecDecoder extends Thread implements MyFrameAvailableListener {

    private static final boolean VERBOSE = true;
    private static final String LOG_TAG = MediaCodecDecoder.class.getSimpleName();
    private static final String VIDEO_FORMAT = "video/avc"; // h.264
    private static final long mTimeoutUs = 10000L;

    private MediaCodec mMediaCodec;
    Surface mSurface;
    volatile boolean m_bConfigured;
    volatile boolean m_bRunning;
    long startMs;

    public MediaCodecDecoder() {
        JniWrapper.SetFrameAvailableListener(this);
    }

    // this is my callback where I am receiving encoded streams from native layer 
    @Override
    public void OnRecvEncodedData(byte[] encodedData) {
        if(!m_bConfigured && bKeyFrame(encodedData)) {
            Configure(mSurface, 240, 320, encodedData);
        }
        if(m_bConfigured) {
            decodeData(encodedData);
        }
    }

    public void SetSurface(Surface surface) {
        if (mSurface == null) {
            mSurface = surface;
        }
    }

    public void Start() {
        if(m_bRunning)
            return;
        m_bRunning = true;
        start();
    }

    public void Stop() {
        if(!m_bRunning)
            return;
        m_bRunning = false;
        mMediaCodec.stop();
        mMediaCodec.release();
    }

    private void Configure(Surface surface, int width, int height, byte[] csd0) {
        if (m_bConfigured) {
            Log.e(LOG_TAG, "Decoder is already configured");
            return;
        }
        if (mSurface == null) {
            Log.d(LOG_TAG, "Surface is not available/set yet.");
            return;
        }
        MediaFormat format = MediaFormat.createVideoFormat(VIDEO_FORMAT, width, height);
        format.setByteBuffer("csd-0", ByteBuffer.wrap(csd0));
        try {
            mMediaCodec = MediaCodec.createDecoderByType(VIDEO_FORMAT);
        } catch (IOException e) {
            Log.d(LOG_TAG, "Failed to create codec: " + e.getMessage());
        }

        startMs = System.currentTimeMillis();
        mMediaCodec.configure(format, surface, null, 0);
        if (VERBOSE) Log.d(LOG_TAG, "Decoder configured.");

        mMediaCodec.start();
        Log.d(LOG_TAG, "Decoder initialized.");

        m_bConfigured = true;
    }

    @SuppressWarnings("deprecation")
    private void decodeData(byte[] data) {
        if (!m_bConfigured) {
            Log.e(LOG_TAG, "Decoder is not configured yet.");
            return;
        }
        int inIndex = mMediaCodec.dequeueInputBuffer(mTimeoutUs);
        if (inIndex >= 0) {
            ByteBuffer buffer;
            if (Build.VERSION.SDK_INT < Build.VERSION_CODES.LOLLIPOP) {
                buffer = mMediaCodec.getInputBuffers()[inIndex];
                buffer.clear();
            } else {
                buffer = mMediaCodec.getInputBuffer(inIndex);
            }
            if (buffer != null) {
                buffer.put(data);
                long presentationTimeUs = System.currentTimeMillis() - startMs;
                mMediaCodec.queueInputBuffer(inIndex, 0, data.length, presentationTimeUs, 0);
            }
        }
    }

    private static boolean bKeyFrame(byte[] frameData) {
        return ( ( (frameData[4] & 0xFF) & 0x0F) == 0x07);
    }

    @Override
    public void run() {
        try {
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            while(m_bRunning) {
                if(m_bConfigured) {
                    int outIndex = mMediaCodec.dequeueOutputBuffer(info, mTimeoutUs);
                    if(outIndex >= 0) {
                        mMediaCodec.releaseOutputBuffer(outIndex, true);
                    }
                } else {
                    try {
                        Thread.sleep(10);
                    } catch (InterruptedException ignore) {
                    }
                }
            }
        } finally {
            Stop();
        }
    }
}

Now the problem is that the stream is being decoded and rendered on the surface, but the video is not clean. The frames look broken and the scene is distorted/dirty; motion is jerky and square-shaped artifacts appear everywhere (sorry, I don't have a screenshot right now).

About my stream: it is H.264 encoded and consists of I-frames and P-frames only (there are no B-frames). Every I-frame has an SPS + PPS + payload structure. The color format used during encoding (with FFmpeg in the native layer) is YUV420 planar. The length of the data sent from the native layer is correct (width * height * 3 / 2).
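
For reference, that length is just the raw YUV 4:2:0 frame size: one full-resolution luma plane plus two quarter-resolution chroma planes (a trivial arithmetic sketch, not code from the project):

// YUV 4:2:0: width*height luma bytes + 2 * (width/2 * height/2) chroma bytes = width*height*3/2.
static int yuv420FrameSize(int width, int height) {
    return width * height * 3 / 2;   // e.g. 240 x 320 -> 115200 bytes
}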

During configure() I only set the csd-0 value with the SPS. The frame used for configuration was an I-frame (SPS + PPS + payload) whose prefix was the SPS NAL unit, so I think the configuration succeeded. Note that I did not set the csd-1 value with the PPS (is that a problem?).
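
In case it helps, one way to pull the SPS and PPS out of such a keyframe and hand them to MediaFormat as separate csd-0/csd-1 buffers could look like the sketch below. It is illustrative only and assumes 4-byte (00 00 00 01) start codes in front of the SPS and PPS, as described in the next paragraph; CsdHelper and splitNalUnits are made-up names, not part of my code.

import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

import android.media.MediaFormat;

final class CsdHelper {

    // Split an Annex-B buffer into individual NAL units, keeping the 4-byte
    // start code (00 00 00 01) in front of each unit. Illustrative helper.
    static List<byte[]> splitNalUnits(byte[] data) {
        List<Integer> starts = new ArrayList<>();
        for (int i = 0; i + 3 < data.length; i++) {
            if (data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0 && data[i + 3] == 1) {
                starts.add(i);
            }
        }
        List<byte[]> units = new ArrayList<>();
        for (int i = 0; i < starts.size(); i++) {
            int from = starts.get(i);
            int to = (i + 1 < starts.size()) ? starts.get(i + 1) : data.length;
            byte[] unit = new byte[to - from];
            System.arraycopy(data, from, unit, 0, unit.length);
            units.add(unit);
        }
        return units;
    }

    // Copy the SPS (type 7) into csd-0 and the PPS (type 8) into csd-1.
    static void setCsd(MediaFormat format, byte[] keyFrame) {
        for (byte[] nal : splitNalUnits(keyFrame)) {
            int type = nal[4] & 0x1F;   // nal_unit_type of the byte after the start code
            if (type == 7) {
                format.setByteBuffer("csd-0", ByteBuffer.wrap(nal));
            } else if (type == 8) {
                format.setByteBuffer("csd-1", ByteBuffer.wrap(nal));
            }
        }
    }
}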

Every frame, both P-frames and I-frames, is preceded by a start code (0x00 0x00 0x00 0x01); for I-frames the start code is present in front of both the SPS and the PPS NAL units.
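
For what it's worth, the NAL unit type lives in the low five bits of the byte right after the start code, so a keyframe check can be written like this (a small sketch, not the posted bKeyFrame; masking with 0x1F instead of 0x0F keeps types above 15 from being misread):

// NAL unit type = low five bits of the byte right after the start code.
// Assumes a 4-byte (00 00 00 01) start code, as in this stream.
static int firstNalType(byte[] frame) {
    return frame[4] & 0x1F;
}

// 7 = SPS, 8 = PPS, 6 = SEI, 5 = IDR slice, 1 = non-IDR slice
static boolean startsWithSps(byte[] frame) {
    return firstNalType(frame) == 7;
}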

Moreover, I am setting the presentation timestamp to System.currentTimeMillis() - startMs for every frame, which increases with every new frame. I think this shouldn't cause any problem (correct me if I am wrong).
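
Note that the presentationTimeUs parameter of queueInputBuffer() is specified in microseconds; a millisecond delta is still monotonically increasing, but to be strict about units a microsecond-based generator could be used instead. A sketch of such a helper (an assumption, not code from the question):

// Derives the presentation timestamp from a monotonic microsecond clock, since
// the presentationTimeUs argument of queueInputBuffer() is expressed in microseconds.
final class PtsGenerator {
    private long mStartNs = -1;

    long nextPtsUs() {
        long nowNs = System.nanoTime();      // monotonic clock
        if (mStartNs < 0) {
            mStartNs = nowNs;
        }
        return (nowNs - mStartNs) / 1000L;   // nanoseconds -> microseconds
    }
}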

My device is a Google Nexus 5 running Android 4.4.4, with a Qualcomm MSM8974 Snapdragon 800 chipset. I am decoding to a Surface, so I think there should not be any device-specific color-format mismatch issues.

I can also provide my TextureView code if needed.
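
For reference, the wiring follows the usual TextureView pattern, roughly like this simplified sketch (not my exact code; the layout id is made up):

// Inside e.g. Activity.onCreate(). Imports: android.graphics.SurfaceTexture,
// android.view.Surface, android.view.TextureView.
TextureView textureView = (TextureView) findViewById(R.id.texture_view); // hypothetical id
final MediaCodecDecoder decoder = new MediaCodecDecoder();

textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
        decoder.SetSurface(new Surface(surfaceTexture)); // hand the render target to the decoder
        decoder.Start();                                 // decoder configures itself on the first keyframe
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
        decoder.Stop();
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) { }
});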

What might be the cause of my incorrect decoding/rendering? Thanks in advance!

EDIT 1

I tried manually passing my codec-specific data (SPS and PPS bytes) during configuration, but this didn't make any difference :(

byte[] sps  = {0x00, 0x00, 0x00, 0x01, 0x67, 0x4d, 0x40, 0x0c, (byte) 0xda, 0x0f, 0x0a, 0x68, 0x40, 0x00, 0x00, 0x03, 0x00, 0x40, 0x00, 0x00, 0x07, (byte) 0xa3, (byte) 0xc5, 0x0a, (byte) 0xa8};
format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));

byte[] pps = {0x00, 0x00, 0x00, 0x01, 0x68, (byte) 0xef, 0x04, (byte) 0xf2, 0x00, 0x00};
format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));

I also tried trimming the start codes (0x00, 0x00, 0x00, 0x01), but no progress!

EDIT 2

I tried with a hardware-accelerated TextureView, as mentioned in the official documentation (though I didn't find any H/W acceleration code in the MediaCodec-TextureView sample project). Still no progress. For now I have commented out the H/W acceleration code snippet.
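
As a sanity check, whether the window is actually hardware accelerated can be verified at runtime with something like this small sketch (assuming textureView is the view in question):

// TextureView only draws inside a hardware-accelerated window, so this is worth
// checking once the view is attached (it returns false before attachment).
if (!textureView.isHardwareAccelerated()) {
    Log.w(LOG_TAG, "Window is not hardware accelerated - TextureView will not render");
}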

EDIT 3

Here is a screenshot now:

[Screenshot: broken decoded output]

EDIT 4

For further clarification, this is the hex dump of one of my H.264 encoded I-frames:

00 00 00 01 67 4d 40 0c da 0f 0a 68 40 00 00 03 00 40 00 00 07 a3 c5 0a a8 00 00 00 01 68 ef 04 f2 00 00 01 06 05 ff ff 69 dc 45 e9 bd e6 d9 48 b7 96 2c d8 20 d9 23 ee ef 78 32 36 34 20 2d 20 63 6f 72 65 20 31 34 36 20 2d 20 48 2e 32 36 34 2f 4d 50 45 47 2d 34 20 41 56 43 20 63 6f 64 65 63 20 2d 20 43 6f 70 79 6c 65 66 74 20 32 30 30 33 2d 32 30 31 35 20 2d 20 68 74 74 70 3a 2f 2f 77 77 77 2e 76 69 64 65 6f 6c 61 6e 2e 6f 72 67 2f 78 32 36 34 2e 68 74 6d 6c 20 2d 20 6f 70 74 69 6f 6e 73 3a 20 63 61 62 61 63 3d 31 20 72 65 66 3d 31 20 64 65 62 6c 6f 63 6b 3d 31 3a 30 3a 30 20 61 6e 61 6c 79 73 65 3d 30 78 31 3a 30 78 31 20 6d 65 3d 68 65 78 20 73 75 62 6d 65 3d 30 20 70 73 79 3d 31 20 70 73 79 5f 72 64 3d 31 2e 30 30 3a 30 2e 30 30 20 6d 69 78 65 64 5f 72 65 66 3d 30 20 6d 65 5f 72 61 6e 67 65 3d 31 36 20 63 68 72 6f 6d 61 5f 6d 65 3d 31 20 74 72 65 6c 6c 69 73 3d 30 20 38 78 38 64 63 74

And this is a P-frame:

00 00 00 01 41 9a 26 22 df 76 4b b2 ef cf 57 ac 5b b6 3b 68 b9 87 b2 71 a5 9b 61 3c 93 47 bc 79 c5 ab 0f 87 34 f6 40 6a cd 80 03 b1 a2 c2 4e 08 13 cd 4e 3c 62 3e 44 0a e8 97 80 ec 81 3f 31 7c f1 29 f1 43 a0 c0 a9 0a 74 62 c7 62 74 da c3 94 f5 19 23 ff 4b 9c c1 69 55 54 2f 62 f0 5e 64 7f 18 3f 58 73 af 93 6e 92 06 fd 9f a1 1a 80 cf 86 71 24 7d f7 56 2c c1 57 cf ba 05 17 77 18 f1 8b 3c 33 40 18 30 1f b0 19 23 44 ec 91 c4 bd 80 65 4a 46 b3 1e 53 5d 6d a3 f0 b5 50 3a 93 ba 81 71 f3 09 98 41 43 ba 5f a1 0d 41 a3 7b c3 fd eb 15 89 75 66 a9 ee 3a 9c 1b c1 aa f8 58 10 88 0c 79 77 ff 7d 15 28 eb 12 a7 1b 76 36 aa 84 e1 3e 63 cf a9 a3 cf 4a 2d c2 33 18 91 30 f7 3c 9c 56 f5 4c 12 6c 4b 12 1f c5 ec 5a 98 8c 12 75 eb fd 98 a4 fb 7f 80 5d 28 f9 ef 43 a4 0a ca 25 75 19 6b f7 14 7b 76 af e9 8f 7d 79 fa 9d 9a 63 de 1f be fa 6c 65 ba 5f 9d b0 b0 f4 71 cb e2 ea d6 dc c6 55 98 1b cd 55 d9 eb 9c 75 fc 9d ec
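
To decode these dumps by hand, a helper like the one below (a debugging sketch, not part of the project) walks the Annex-B start codes and logs each nal_unit_type:

import android.util.Log;

final class NalDump {
    // Walks an Annex-B buffer and logs every NAL unit type it finds. Handles both
    // 3-byte (00 00 01) and 4-byte (00 00 00 01) start codes, since the SEI in the
    // I-frame dump above uses the 3-byte form.
    static void logNalTypes(String tag, byte[] data) {
        for (int i = 0; i + 3 < data.length; i++) {
            boolean start4 = i + 4 < data.length
                    && data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0 && data[i + 3] == 1;
            boolean start3 = data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1;
            if (start4) {
                Log.d(tag, "offset " + i + ": nal_unit_type " + (data[i + 4] & 0x1F));
                i += 4;
            } else if (start3) {
                Log.d(tag, "offset " + i + ": nal_unit_type " + (data[i + 3] & 0x1F));
                i += 3;
            }
        }
    }
}

On the I-frame dump this reports types 7 (SPS), 8 (PPS) and 6 (an SEI carrying the x264 encoder banner text); on the P-frame dump it reports a single type 1 (non-IDR slice), which matches the leading 0x41 NAL header.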

I am fairly sure the stream itself is correct, as I successfully rendered it using FFmpeg for decoding and a GLSurfaceView with OpenGL ES 2.0.

Answer

I took H.264 dumps from both the native layer and the Java layer and found that the native-layer dump played perfectly, while the Java-layer dump played back as broken as the decoded stream. The problem was that while passing the encoded stream from the native layer to Java, it was not passed properly (it was corrupted), and this was due to my buggy implementation (sorry to those who were following this thread for the inconvenience).
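
For anyone hitting a similar issue, the Java-side dump can be produced with a helper along these lines (a simplified sketch; the file path is only an example):

import java.io.FileOutputStream;
import java.io.IOException;

// Debugging helper: appends every received encoded buffer to a raw Annex-B dump
// so it can be inspected or played back offline (e.g. "ffplay -f h264 received_dump.h264").
final class EncodedStreamDumper {
    private FileOutputStream mOut;

    synchronized void append(byte[] encodedData) {
        try {
            if (mOut == null) {
                mOut = new FileOutputStream("/sdcard/received_dump.h264", true /* append */);
            }
            mOut.write(encodedData);
        } catch (IOException e) {
            android.util.Log.e("EncodedStreamDumper", "dump failed: " + e.getMessage());
        }
    }
}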

Moreover, I was passing only the I-frame's payload to the decoder, which resulted in broken rendering. Now I am passing the complete NAL units (SPS + PPS + payload) and everything is fine :)
