Using MediaCodec to save a series of images as a video

Problem description

I am trying to use MediaCodec to save a series of images, stored as byte arrays in a file, to a video file. I have tested these images on a SurfaceView (playing them in series) and I can see them fine. I have looked at many examples using MediaCodec, and here is what I understand (please correct me if I am wrong):

Get input buffers from the MediaCodec object -> fill them with your frame's image data -> queue the input buffer -> get the encoded output buffer -> write it to a file -> increase the presentation time and repeat.

However, I have tested this a lot and I end up with one of two cases:

  • All sample projects I tried to imitate caused the media server to die when calling queueInputBuffer for the second time.
  • I tried calling codec.flush() at the end (after saving the output buffer to the file, although none of the examples I saw did this), and the media server did not die; however, I cannot open the output video file with any media player, so something is wrong.

Here is my code:

MediaCodec codec = MediaCodec.createEncoderByType(MIMETYPE);
        MediaFormat mediaFormat = null;
        if(CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)){
            mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 1280 , 720);
        } else {
            mediaFormat = MediaFormat.createVideoFormat(MIMETYPE, 720, 480);
        }


        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
        codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

        codec.start();

        ByteBuffer[] inputBuffers = codec.getInputBuffers();
        ByteBuffer[] outputBuffers = codec.getOutputBuffers();
        boolean sawInputEOS = false;
        int inputBufferIndex=-1,outputBufferIndex=-1;
        BufferInfo info=null;

        // loop to read the next YUV byte array ("dat", "bytesread" bytes) from the file

            inputBufferIndex = codec.dequeueInputBuffer(WAITTIME);
            if(bytesread <= 0) sawInputEOS = true;

            if(inputBufferIndex >= 0){
                if(!sawInputEOS){
                    int samplesiz=dat.length;
                    inputBuffers[inputBufferIndex].put(dat);
                    codec.queueInputBuffer(inputBufferIndex, 0, samplesiz, presentationTime, 0);
                    presentationTime += 100;

                    info = new BufferInfo();
                    outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);
                    Log.i("BATA", "outputBufferIndex="+outputBufferIndex);
                    if(outputBufferIndex >= 0){
                        byte[] array = new byte[info.size];
                        outputBuffers[outputBufferIndex].get(array);

                        if(array != null){
                            try {
                                dos.write(array);
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                        }

                        codec.releaseOutputBuffer(outputBufferIndex, false);
                        inputBuffers[inputBufferIndex].clear();
                        outputBuffers[outputBufferIndex].clear();

                        if(sawInputEOS) break;
                    }
                }else{
                    codec.queueInputBuffer(inputBufferIndex, 0, 0, presentationTime, MediaCodec.BUFFER_FLAG_END_OF_STREAM);

                    info = new BufferInfo();
                    outputBufferIndex = codec.dequeueOutputBuffer(info, WAITTIME);

                    if(outputBufferIndex >= 0){
                        byte[] array = new byte[info.size];
                        outputBuffers[outputBufferIndex].get(array);

                        if(array != null){
                            try {
                                dos.write(array);
                            } catch (IOException e) {
                                e.printStackTrace();
                            }
                        }

                        codec.releaseOutputBuffer(outputBufferIndex, false);
                        inputBuffers[inputBufferIndex].clear();
                        outputBuffers[outputBufferIndex].clear();
                        break;
                    }
                }


            }
        }

        codec.flush();

        try {
            fstream2.close();
            dos.flush();
            dos.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        codec.stop();
        codec.release();
        codec = null;

        return true;

    }

My question is: how can I get a working video from a stream of images using MediaCodec? What am I doing wrong?

Another question (if I am not being too greedy): I would like to add an audio track to this video. Can that be done with MediaCodec as well, or must I use FFmpeg?

Note: I know about MediaMuxer in Android 4.3; however, it is not an option for me as my app must work on Android 4.1+.

Update: Thanks to fadden's answer, I was able to reach EOS without the media server dying (the code above reflects that modification). However, the file I am getting is gibberish. Here is a snapshot of the video I get (it only plays as a .h264 file).

My input image format is YUV (NV21 from the camera preview). I can't get the output into any playable format; I tried all of the COLOR_FormatYUV420 formats and get the same gibberish output. And I still can't find a way (using MediaCodec) to add audio.

Answer

I think you have the right general idea. Some things to be aware of:

  • Not all devices support COLOR_FormatYUV420SemiPlanar. Some only accept planar. (Android 4.3 introduced CTS tests to ensure that the AVC codec supports one or the other.)
  • It's not the case that queueing an input buffer will immediately result in the generation of one output buffer. Some codecs may accumulate several frames of input before producing output, and may produce output after your input has finished. Make sure your loop takes that into account (e.g. your inputBuffers[].clear() will blow up if the index is still -1); see the sketch after this list for one way to structure the loop.
  • Don't try to submit data and send EOS with the same queueInputBuffer call. The data in that frame may be discarded. Always send EOS with a zero-length buffer.
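
To make the last two points concrete, here is a rough sketch of a feed/drain loop that uses the pre-4.3 buffer-array API, drains the encoder independently of feeding it, and sends EOS as an empty buffer. It reuses the question's codec, inputBuffers, outputBuffers, WAITTIME and dos; readFrame() is a hypothetical helper standing in for the file-reading code, and exception handling is omitted:

MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
boolean inputDone = false, outputDone = false;
long presentationTimeUs = 0;
final long frameIntervalUs = 1000000L / 10;   // 10 fps -> 100,000 us per frame

while (!outputDone) {
    if (!inputDone) {
        int inIndex = codec.dequeueInputBuffer(WAITTIME);
        if (inIndex >= 0) {
            byte[] frame = readFrame();           // hypothetical: returns null when no frames remain
            if (frame == null) {
                // EOS goes in an empty buffer, never together with frame data
                codec.queueInputBuffer(inIndex, 0, 0, presentationTimeUs,
                        MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                inputDone = true;
            } else {
                ByteBuffer inBuf = inputBuffers[inIndex];
                inBuf.clear();
                inBuf.put(frame);
                codec.queueInputBuffer(inIndex, 0, frame.length, presentationTimeUs, 0);
                presentationTimeUs += frameIntervalUs;
            }
        }
    }

    // Drain whatever output is available: there may be none for this input, or several at once.
    int outIndex = codec.dequeueOutputBuffer(bufferInfo, WAITTIME);
    if (outIndex >= 0) {
        byte[] chunk = new byte[bufferInfo.size];
        outputBuffers[outIndex].position(bufferInfo.offset);
        outputBuffers[outIndex].get(chunk);
        dos.write(chunk);                         // raw H.264 elementary stream, not a playable .mp4
        codec.releaseOutputBuffer(outIndex, false);
        if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            outputDone = true;
        }
    } else if (outIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        outputBuffers = codec.getOutputBuffers(); // the output buffer array was replaced
    }
    // INFO_TRY_AGAIN_LATER / INFO_OUTPUT_FORMAT_CHANGED need no special handling for a raw stream
}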

The output of the codecs is generally pretty "raw", e.g. the AVC codec emits an H.264 elementary stream rather than a "cooked" .mp4 file. Many players won't accept this format. If you can't rely on the presence of MediaMuxer you will need to find another way to cook the data (search around on stackoverflow for ideas).
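
If MediaMuxer is unavailable (pre-4.3), one option people commonly use is a third-party muxer such as the mp4parser library, which can wrap an H.264 elementary stream in an .mp4 container. The sketch below is only an assumption of how that looks; the class and package names are from mp4parser and vary between versions, so verify them against the release you actually use:

import com.coremedia.iso.boxes.Container;
import com.googlecode.mp4parser.FileDataSourceImpl;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;
import com.googlecode.mp4parser.authoring.tracks.H264TrackImpl;

import java.io.FileOutputStream;
import java.nio.channels.FileChannel;

// Sketch only: wrap a raw .h264 elementary stream in an .mp4 container with mp4parser.
// API details (constructors, package names) depend on the mp4parser version.
public class RawH264ToMp4 {
    public static void mux(String h264Path, String mp4Path) throws Exception {
        Movie movie = new Movie();
        movie.addTrack(new H264TrackImpl(new FileDataSourceImpl(h264Path)));
        Container container = new DefaultMp4Builder().build(movie);
        FileChannel channel = new FileOutputStream(mp4Path).getChannel();
        container.writeContainer(channel);
        channel.close();
    }
}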

It's certainly not expected that the mediaserver process would crash.

You can find some examples and links to the 4.3 CTS tests here.

Update: As of Android 4.3, MediaCodec and Camera have no ByteBuffer formats in common, so at the very least you will need to fiddle with the chroma planes. However, that sort of problem manifests very differently (as shown in the images for this question).

The image you added looks like video, but with stride and/or alignment issues. Make sure your pixels are laid out correctly. In the CTS EncodeDecodeTest (https://android.googlesource.com/platform/cts/+/jb-mr2-release/tests/tests/media/src/android/media/cts/EncodeDecodeTest.java), the generateFrame() method (line 906) shows how to encode both planar and semi-planar YUV420 for MediaCodec.
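
If the encoder is configured for COLOR_FormatYUV420SemiPlanar, the NV21 preview frames also need their chroma bytes swapped (NV21 interleaves V/U, NV12 interleaves U/V). A minimal sketch, assuming the frame is exactly width*height*3/2 bytes with no row padding or alignment (real devices may add stride, which is exactly the kind of problem shown in the image):

// Convert an NV21 camera preview frame to NV12 for a semi-planar encoder.
// Assumes no row padding; stride/alignment handling is device-specific and not shown.
static byte[] nv21ToNv12(byte[] nv21, int width, int height) {
    int ySize = width * height;
    byte[] nv12 = new byte[ySize * 3 / 2];
    System.arraycopy(nv21, 0, nv12, 0, ySize);   // luma plane is identical
    for (int i = ySize; i < nv21.length; i += 2) {
        nv12[i] = nv21[i + 1];                   // U
        nv12[i + 1] = nv21[i];                   // V
    }
    return nv12;
}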

The easiest way to avoid the format issues is to move the frames through a Surface (like the CameraToMpegTest sample), but unfortunately that's not possible in Android 4.1.
