How to use ByteBuffer in the MediaCodec context in android


Problem description

So far I am able to setup a MediaCodec to encode a video stream. The aim is to save my user generated artwork into a video file.

I use android Bitmap objects of the user artwork to push frames into the stream.

See the code snippet I use at the bottom of this post (it is the full code; nothing is trimmed):

MediaCodec uses ByteBuffer to deal with video/audio streams.

Bitmaps are backed by int[], which, if converted to byte[], requires 4× the length of the int[].
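That 4× relationship can be sketched with plain java.nio (the class and method names below are hypothetical, not part of any Android API):

```java
import java.nio.ByteBuffer;

public class PixelBytes {
    // Copies ARGB int pixels (as returned by Bitmap.getPixels) into a byte[];
    // each 32-bit pixel occupies four bytes, hence the 4x size factor.
    static byte[] toBytes(int[] pixels) {
        ByteBuffer bb = ByteBuffer.allocate(pixels.length * 4); // big-endian by default
        bb.asIntBuffer().put(pixels);
        return bb.array();
    }

    public static void main(String[] args) {
        int[] pixels = {0xFF112233, 0xFF445566}; // two ARGB pixels
        byte[] bytes = toBytes(pixels);
        System.out.println(bytes.length); // 8
    }
}
```

Note, though, that an encoder configured for a YUV 420 color format expects width * height * 3 / 2 bytes per frame, not the four-bytes-per-pixel layout of a Bitmap.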

I did some research to figure out what contracts are in place for the ByteBuffer when dealing with video/audio streams in MediaCodec, but the information out there is close to zilch.

So, what are the ByteBuffer usage contracts in MediaCodec?

Does specifying the frame dimensions in the MediaFormat automatically mean that the ByteBuffers have width * height * 4 bytes capacity?

(I use one Bitmap object at a time for each frame.)

Thanks for any help.

(edited, code added)

    import java.io.ByteArrayOutputStream;
    import java.io.DataOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.nio.ByteBuffer;

    import android.graphics.Rect;
    import android.graphics.Bitmap.CompressFormat;
    import android.media.MediaCodec;
    import android.media.MediaCodec.BufferInfo;
    import android.media.CamcorderProfile;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.util.Log;
    import android.view.View;

    public class VideoCaptureManager {

        private boolean running;

        private long presentationTime;

        public void start(View rootView, String saveFilePath){
            Log.e("OUT", saveFilePath);
            this.running = true;
            this.presentationTime = 0;
            this.capture(rootView, saveFilePath);
        }

        private void capture(final View rootView, String saveFilePath){
            if(rootView != null){
                rootView.setDrawingCacheEnabled(true);

                final Rect drawingRect = new Rect();
                rootView.getDrawingRect(drawingRect);

                try{
                    final File file = new File(saveFilePath);
                    if(file.exists()){
                        // File exists return
                        return;
                    } else {
                        File parent = file.getParentFile();
                        if(!parent.exists()){
                            parent.mkdirs();
                        }
                    }

            new Thread(){
                public void run(){
                    try{
                        DataOutputStream dos = new DataOutputStream(new FileOutputStream(file));

                        MediaCodec codec = MediaCodec.createEncoderByType("video/mp4v-es");
                        MediaFormat mediaFormat = null;
                        if(CamcorderProfile.hasProfile(CamcorderProfile.QUALITY_720P)){
                            mediaFormat = MediaFormat.createVideoFormat("video/mp4v-es", 720, 1280);
                        } else {
                            mediaFormat = MediaFormat.createVideoFormat("video/mp4v-es", 480, 720);
                        }


                        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 700000);
                        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 10);
                        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
                        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
                        codec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

                        codec.start();

                        ByteBuffer[] inputBuffers = codec.getInputBuffers();
                        ByteBuffer[] outputBuffers = codec.getOutputBuffers();

                        while(VideoCaptureManager.this.running){
                            try{
                                int inputBufferIndex = codec.dequeueInputBuffer(-2);
                                if(inputBufferIndex >= 0){
                                    // Fill in the bitmap bytes
                                    // inputBuffers[inputBufferIndex].
                                    ByteArrayOutputStream baos = new ByteArrayOutputStream();
                                    rootView.getDrawingCache().compress(CompressFormat.JPEG, 80, baos);
                                    inputBuffers[inputBufferIndex].put(baos.toByteArray());

                                    codec.queueInputBuffer(inputBufferIndex, 0, inputBuffers[inputBufferIndex].capacity(), presentationTime, MediaCodec.BUFFER_FLAG_CODEC_CONFIG);
                                    presentationTime += 100;
                                }

                                BufferInfo info = new BufferInfo();
                                int outputBufferIndex = codec.dequeueOutputBuffer(info, -2);
                                if(outputBufferIndex >= 0){
                                    // Write the bytes to file
                                    byte[] array = outputBuffers[outputBufferIndex].array(); // THIS THROWS AN EXCEPTION. WHAT IS THE CONTRACT TO DEAL WITH ByteBuffer in this code?
                                    if(array != null){
                                        dos.write(array);
                                    }

                                    codec.releaseOutputBuffer(outputBufferIndex, false);
                                } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED){
                                    outputBuffers = codec.getOutputBuffers();
                                } else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED){
                                    // codec format is changed
                                    MediaFormat format = codec.getOutputFormat();
                                }

                                Thread.sleep(100);
                            }catch(Throwable th){
                                Log.e("OUT", th.getMessage(), th);
                            }
                        }

                        codec.stop();
                        codec.release();
                        codec = null;

                        dos.flush();
                        dos.close();
                    }catch(Throwable th){
                        Log.e("OUT", th.getMessage(), th);
                    }
                }
                    }.start();

                }catch(Throwable th){
                    Log.e("OUT", th.getMessage(), th);
                }
            }
        }

        public void stop(){
            this.running = false;
        }
    }

Solution

The exact layout of the ByteBuffer is determined by the codec for the input format you've chosen. Not all devices support all possible input formats (e.g. some AVC encoders require planar 420 YUV, others require semi-planar). Older versions of Android (<= API 17) didn't really provide a portable way to software-generate video frames for MediaCodec.
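To illustrate the two 420 layouts mentioned above, here is a minimal plain-Java sketch (hypothetical helper names; it assumes even frame dimensions). A 420 frame occupies width * height * 3 / 2 bytes: a full-resolution Y plane followed by quarter-resolution chroma, either as separate U and V planes (planar) or as interleaved U/V pairs (semi-planar):

```java
import java.util.Arrays;

public class Yuv420Layout {

    /** Y plane, then U plane, then V plane (COLOR_FormatYUV420Planar). */
    static byte[] packPlanar(byte[] y, byte[] u, byte[] v) {
        byte[] frame = new byte[y.length + u.length + v.length];
        System.arraycopy(y, 0, frame, 0, y.length);
        System.arraycopy(u, 0, frame, y.length, u.length);
        System.arraycopy(v, 0, frame, y.length + u.length, v.length);
        return frame;
    }

    /** Y plane, then interleaved U/V pairs (COLOR_FormatYUV420SemiPlanar). */
    static byte[] packSemiPlanar(byte[] y, byte[] u, byte[] v) {
        byte[] frame = new byte[y.length + u.length + v.length];
        System.arraycopy(y, 0, frame, 0, y.length);
        for (int i = 0; i < u.length; i++) {
            frame[y.length + 2 * i] = u[i];
            frame[y.length + 2 * i + 1] = v[i];
        }
        return frame;
    }

    public static void main(String[] args) {
        // 4x2 frame: 8 luma bytes, 2 U samples, 2 V samples
        byte[] y = new byte[8];
        byte[] u = {10, 11};
        byte[] v = {20, 21};
        // chroma tail is interleaved as U0, V0, U1, V1
        System.out.println(Arrays.toString(packSemiPlanar(y, u, v)));
    }
}
```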

In Android 4.3 (API 18), you have two options. First, MediaCodec now accepts input from a Surface, which means anything you can draw with OpenGL ES can be recorded as a movie. See, for example, the EncodeAndMuxTest sample.

Second, you still have the option of using software-generated YUV 420 buffers, but now they're more likely to work because there are CTS tests that exercise them. You still have to do runtime detection of planar versus semi-planar, but there are really only two layouts. See the buffer-to-buffer variants of the EncodeDecodeTest for an example.
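That runtime detection boils down to scanning the codec's advertised MediaCodecInfo.CodecCapabilities.colorFormats array for a recognized 420 constant. A minimal sketch of the selection logic follows; the class and method names are hypothetical, and the integer values are copied from MediaCodecInfo.CodecCapabilities so the logic can run without Android classes on the classpath:

```java
public class ColorFormatSelector {
    // Constant values mirror android.media.MediaCodecInfo.CodecCapabilities.
    static final int COLOR_FormatYUV420Planar = 19;
    static final int COLOR_FormatYUV420PackedPlanar = 20;
    static final int COLOR_FormatYUV420SemiPlanar = 21;
    static final int COLOR_FormatYUV420PackedSemiPlanar = 39;

    /** Returns the first recognized 420 format, or -1 if none is supported. */
    static int selectColorFormat(int[] supportedFormats) {
        for (int f : supportedFormats) {
            if (isRecognizedFormat(f)) return f;
        }
        return -1; // only vendor-specific formats advertised
    }

    static boolean isRecognizedFormat(int f) {
        switch (f) {
            case COLOR_FormatYUV420Planar:
            case COLOR_FormatYUV420PackedPlanar:
            case COLOR_FormatYUV420SemiPlanar:
            case COLOR_FormatYUV420PackedSemiPlanar:
                return true;
            default:
                return false;
        }
    }

    /** Decides which of the two chroma layouts to generate. */
    static boolean isSemiPlanar(int f) {
        return f == COLOR_FormatYUV420SemiPlanar
                || f == COLOR_FormatYUV420PackedSemiPlanar;
    }

    public static void main(String[] args) {
        // e.g. a codec advertising a vendor-specific format plus semi-planar
        int[] advertised = {0x7FA30C00, COLOR_FormatYUV420SemiPlanar};
        int chosen = selectColorFormat(advertised);
        System.out.println(chosen);               // 21
        System.out.println(isSemiPlanar(chosen)); // true
    }
}
```

On a real device the input array would come from `codecInfo.getCapabilitiesForType(mimeType).colorFormats`.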
