Using MediaCodec to save series of images as Video


Question


I am trying to use MediaCodec to save a series of images (stored as byte arrays in a file) to a video file. I have tested these images on a SurfaceView (playing them in series) and I can see them fine. I have looked at many examples using MediaCodec, and here is what I understand (please correct me if I am wrong):

Get input buffers from the MediaCodec object -> fill one with your frame's image data -> queue the input buffer -> dequeue a coded output buffer -> write it to a file -> increase the presentation time and repeat

However, I have tested this a lot and I end up with one of two cases:

  • All the sample projects I tried to imitate caused the media server to die when queueInputBuffer was called the second time.
  • I tried calling codec.flush() at the end (after saving the output buffer to the file, although none of the examples I saw did this) and the media server did not die. However, I cannot open the output video file with any media player, so something is wrong.

Here is my code:

MediaCodec codec = MediaCodec.createEncoderByType(MIMETYPE);
        MediaFormat mediaFormat = null;
        if(CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)){
            mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 1280 , 720);
        } else {
            mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 720, 480);
        }


        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        ByteBuffer[] outputBuffers = codec.getOutputBuffers();
        boolean sawInputEOS = false;
        int inputBufferIndex=-1,outputBufferIndex=-1;
        BufferInfo info=null;

        while (true) { // loop elided in the original: reads the next YUV frame from the file into dat[]; bytesread <= 0 signals end of file

            inputBufferIndex = codec.dequeueInputBuffer(WAITTIME);
            if (bytesread <= 0) sawInputEOS = true;

            if(inputBufferIndex >= 0){
                if(!sawInputEOS){
                    int samplesiz=dat.length;
                    inputBuffers[inputBufferIndex].put(dat);
                    codec.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
                    presentationTime += 100;

                    info = new BufferInfo();
                    outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
                    Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
                    if(outputBufferIndex >= 0){
                        byte[] array = new byte[info.size];
                        outputBuffers[outputBufferIndex].get(array);

                        if(array != null){
                            try {
                                dos.write(array);
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                        }

                        codec.releaseOutputBuffer(outputBufferIndex, false);
                        inputBuffers[inputBufferIndex].clear();
                        outputBuffers[outputBufferIndex].clear();

                        if(sawInputEOS) break;
                    }
                }else{
                    codec.queueInputBuffer(inputBufferIndex, 0, 0, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);

                    info = new BufferInfo();
                    outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);

                    if(outputBufferIndex >= 0){
                        byte[] array = new byte[info.size];
                        outputBuffers[outputBufferIndex].get(array);

                        if(array != null){
                            try {
                                dos.write(array);
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                        }

                        codec.releaseOutputBuffer(outputBufferIndex, false);
                        inputBuffers[inputBufferIndex].clear();
                        outputBuffers[outputBufferIndex].clear();
                        break;
                    }
                }


            }
        }

        codec.flush();

        try {
            fstream2.close();
            dos.flush();
            dos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        codec.stop();
        codec.release();
        codec = null;

        return true;

    }

My question is: how can I get a working video from a stream of images using MediaCodec? What am I doing wrong?

Another question (if I am not being too greedy): I would like to add an audio track to this video. Can that be done with MediaCodec as well, or must I use FFmpeg?

Note: I know about MediaMuxer in Android 4.3; however, it is not an option for me because my app must work on Android 4.1+.

Update: Thanks to fadden's answer, I was able to reach EOS without the media server dying (the code above is the modified version). However, the file I am getting produces gibberish. Here is a snapshot of the video I get (it only plays as a raw .h264 file).

My input image format is YUV (NV21 from the camera preview). I can't get the output into any playable format. I tried all the COLOR_FormatYUV420 formats and got the same gibberish output. And I still can't find a way (using MediaCodec) to add audio.

Solution

I think you have the right general idea. Some things to be aware of:

  • Not all devices support COLOR_FormatYUV420SemiPlanar. Some only accept planar. (Android 4.3 introduced CTS tests to ensure that the AVC codec supports one or the other.) You can query the codec's capabilities to pick a supported format; see the sketch after this list.
  • It's not the case that queueing an input buffer will immediately result in the generation of one output buffer. Some codecs may accumulate several frames of input before producing output, and may produce output after your input has finished. Make sure your loop takes that into account (e.g. your inputBuffers[].clear() will blow up if the buffer index is still -1); the sketch after this list shows one way to structure the feed/drain loop.
  • Don't try to submit data and send EOS with the same queueInputBuffer call. The data in that frame may be discarded. Always send EOS with a zero-length buffer.
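To make the first two points concrete, here is a minimal sketch of a capability query plus a feed/drain loop, using the same API-16 ByteBuffer-array interface as your code. It is illustrative rather than drop-in: EncodeSketch, selectColorFormat() and encodeAll() are hypothetical names, and the frame list stands in for your file-reading loop.

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaCodecList;
    import java.io.DataOutputStream;
    import java.io.IOException;
    import java.nio.ByteBuffer;
    import java.util.Iterator;
    import java.util.List;

    class EncodeSketch {

        // Ask the installed encoders which YUV420 layout they actually support,
        // instead of hard-coding COLOR_FormatYUV420SemiPlanar.
        static int selectColorFormat(String mime) {
            int[] preferred = {
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar,
            };
            for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
                MediaCodecInfo ci = MediaCodecList.getCodecInfoAt(i);
                if (!ci.isEncoder()) continue;
                for (String type : ci.getSupportedTypes()) {
                    if (!type.equalsIgnoreCase(mime)) continue;
                    for (int want : preferred) {
                        for (int have : ci.getCapabilitiesForType(mime).colorFormats) {
                            if (want == have) return want;
                        }
                    }
                }
            }
            return -1; // no usable format found
        }

        // Feed/drain loop for an encoder that is already configured and started.
        // `frames` stands in for the question's file-reading loop.
        static void encodeAll(MediaCodec codec, List<byte[]> frames,
                              DataOutputStream dos) throws IOException {
            final long TIMEOUT_US = 10000;  // like the question's WAITTIME
            final int FRAME_RATE = 10;      // matches KEY_FRAME_RATE above
            ByteBuffer[] inputBuffers = codec.getInputBuffers();
            ByteBuffer[] outputBuffers = codec.getOutputBuffers();
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            Iterator<byte[]> it = frames.iterator();
            long presentationTimeUs = 0;
            boolean sawInputEOS = false, sawOutputEOS = false;

            while (!sawOutputEOS) {
                if (!sawInputEOS) {
                    int inIndex = codec.dequeueInputBuffer(TIMEOUT_US);
                    if (inIndex >= 0) {
                        if (!it.hasNext()) {
                            // EOS goes in its own zero-length buffer,
                            // never together with frame data.
                            codec.queueInputBuffer(inIndex, 0, 0, presentationTimeUs,
                                    MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            sawInputEOS = true;
                        } else {
                            byte[] frame = it.next();
                            inputBuffers[inIndex].clear();
                            inputBuffers[inIndex].put(frame);
                            codec.queueInputBuffer(inIndex, 0, frame.length,
                                    presentationTimeUs, 0);
                            // timestamps are in microseconds
                            presentationTimeUs += 1000000L / FRAME_RATE;
                        }
                    }
                }
                int outIndex = codec.dequeueOutputBuffer(info, TIMEOUT_US);
                if (outIndex >= 0) {
                    byte[] chunk = new byte[info.size];
                    outputBuffers[outIndex].position(info.offset);
                    outputBuffers[outIndex].get(chunk);
                    dos.write(chunk);       // raw H.264 elementary stream, not .mp4
                    codec.releaseOutputBuffer(outIndex, false);
                    if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        sawOutputEOS = true;
                    }
                } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                    outputBuffers = codec.getOutputBuffers();
                }
                // INFO_TRY_AGAIN_LATER (-1) just means no output is ready yet;
                // keep looping instead of indexing outputBuffers[-1].
                // INFO_OUTPUT_FORMAT_CHANGED (-2) can be ignored when writing
                // a raw stream.
            }
        }
    }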

The output of the codecs is generally pretty "raw", e.g. the AVC codec emits an H.264 elementary stream rather than a "cooked" .mp4 file. Many players won't accept this format. If you can't rely on the presence of MediaMuxer you will need to find another way to cook the data (search around on stackoverflow for ideas).

It's certainly not expected that the mediaserver process would crash.

You can find some examples and links to the 4.3 CTS tests here.

Update: As of Android 4.3, MediaCodec and Camera have no ByteBuffer formats in common, so at the very least you will need to fiddle with the chroma planes. However, that sort of problem manifests very differently (as shown in the images for this question).

The image you added looks like video, but with stride and/or alignment issues. Make sure your pixels are laid out correctly. In the CTS EncodeDecodeTest, the generateFrame() method (line 906) shows how to encode both planar and semi-planar YUV420 for MediaCodec.
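One concrete gotcha given the NV21 input: NV21 matches neither encoder layout directly, because its chroma samples are interleaved in V,U order. A minimal conversion sketch (hypothetical helpers; assumes even dimensions and stride == width, while real devices may impose additional stride/slice-height alignment, which is exactly what makes generateFrame() worth studying):

    // NV21 -> COLOR_FormatYUV420SemiPlanar (NV12): same layout except each
    // chroma pair is U,V instead of V,U, so just swap the pairs.
    static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
        byte[] out = new byte[nv21.length];
        int ySize = width * height;
        System.arraycopy(nv21, 0, out, 0, ySize);      // Y plane is unchanged
        for (int i = ySize; i < nv21.length; i += 2) {
            out[i] = nv21[i + 1];                      // U
            out[i + 1] = nv21[i];                      // V
        }
        return out;
    }

    // NV21 -> COLOR_FormatYUV420Planar (I420): de-interleave the chroma into
    // a full U plane followed by a full V plane.
    static byte[] nv21ToI420(byte[] nv21, int width, int height) {
        byte[] out = new byte[nv21.length];
        int ySize = width * height;
        int chromaSize = ySize / 4;
        System.arraycopy(nv21, 0, out, 0, ySize);      // Y plane is unchanged
        for (int i = 0; i < chromaSize; i++) {
            out[ySize + i] = nv21[ySize + 2 * i + 1];          // U plane
            out[ySize + chromaSize + i] = nv21[ySize + 2 * i]; // V plane
        }
        return out;
    }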

The easiest way to avoid the format issues is to move the frames through a Surface (like the CameraToMpegTest sample), but unfortunately that's not possible in Android 4.1.
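For completeness, on Android 4.3+ (API 18) the Surface path looks roughly like this; it is sketched here only for readers who can raise their minimum SDK, reusing the format values from the question:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.io.IOException;

    // API 18+ only: the encoder consumes frames rendered onto a Surface,
    // so no YUV byte-array handling (or color-format conversion) is needed.
    static void startSurfaceEncoder() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // createInputSurface() must be called between configure() and start().
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();
        // Render each frame into inputSurface with OpenGL ES (see CameraToMpegTest),
        // drain the encoder's output as usual, and finish with
        // encoder.signalEndOfInputStream() instead of an EOS input buffer.
    }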
