The H.264 AVC video encoded by MediaCodec in Android cannot be played


Problem Description


BACKGROUND:

I have been working for two days on implementing a Vine-like video recorder. First, I tried MediaRecorder, but the video I need may be composed of several small clips, and that class cannot be used to record short video clips. Then I found MediaCodec, FFmpeg and JavaCV. FFmpeg and JavaCV could solve the problem, but I would have to compile my project with many library files, which would produce a very large APK. So I prefer to implement it with MediaCodec, even though this class is only available on Android 4.1 and later; that still covers about 90% of users.

RESULT:

I finally got the encoded file, but it cannot be played. I checked it with ffprobe, and the result is:

Input #0, h264, from 'test.mp4':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (Baseline), yuv420p, 640x480, 25 fps, 25 tbr, 1200k tbn, 50 tbc

I do not know much about the mechanics of H.264 encoding.

CODE:

Modified from this link: http://stackoverflow.com/questions/13458289/encoding-h-264-from-camera-with-android-mediacodec

public class AvcEncoder {

private static String TAG = AvcEncoder.class.getSimpleName();

private MediaCodec mediaCodec;
private BufferedOutputStream outputStream;
private int mWidth, mHeight;
private byte[] mDestData;

public AvcEncoder(int w, int h) {

    mWidth = w;
    mHeight = h;
    Log.d(TAG, "Thread Id: " + Thread.currentThread().getId());

    File f = new File("/sdcard/videos/test.mp4");

    try {
        outputStream = new BufferedOutputStream(new FileOutputStream(f));
        Log.i("AvcEncoder", "outputStream initialized");
    } catch (Exception e) {
        e.printStackTrace();
    }

    try {
        mediaCodec = MediaCodec.createEncoderByType("video/avc");
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", w,
            h);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 15);
    // mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
    // MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);

    mDestData = new byte[w * h
            * ImageFormat.getBitsPerPixel(ImageFormat.YV12) / 8];
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    mediaCodec.configure(mediaFormat, null, null,
            MediaCodec.CONFIGURE_FLAG_ENCODE);
    mediaCodec.start();
}

public void close() {
    try {
        mediaCodec.stop();
        mediaCodec.release();
        mediaCodec = null;

        // outputStream.flush();
        outputStream.close();
    } catch (IOException e) {
        // Ignored: nothing useful to do if stopping the codec or
        // closing the stream fails during teardown.
    }
}

public void offerEncoder(byte[] input) {
    try {
        CameraUtils.transYV12toYUV420Planar(input, mDestData, mWidth,
                mHeight);
        ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
        ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
        int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);

        if (inputBufferIndex >= 0) {
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(mDestData);
            mediaCodec.queueInputBuffer(inputBufferIndex, 0,
                    mDestData.length, 0, 0);
        }

        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
        int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,
                0);

        while (outputBufferIndex >= 0) {
            ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
            byte[] outData = new byte[bufferInfo.size];
            outputBuffer.get(outData);
            try {
                outputStream.write(outData, 0, outData.length);

            } catch (Exception e) {
                Log.d("AvcEncoder", "Outputstream write failed");
                e.printStackTrace();
            }
            // Log.i("AvcEncoder", outData.length + " bytes written");

            mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
            outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo,
                    0);

        }
    } catch (Throwable t) {
        t.printStackTrace();
    }
}
}

Invoke this class from the Camera's startPreview:

private void startPreview() {
    if (mCamera == null) {
        return;
    }
    try {
        mCamera.setPreviewDisplay(mSurfaceView.getHolder());
        Parameters p = mCamera.getParameters();
        Size s = p.getPreviewSize();
        int len = s.width * s.height
                * ImageFormat.getBitsPerPixel(p.getPreviewFormat()) / 8;
        mAvcEncoder = new AvcEncoder(s.width, s.height);
        mCamera.addCallbackBuffer(new byte[len]);
        mCamera.setPreviewCallbackWithBuffer(new PreviewCallback() {

            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                mAvcEncoder.offerEncoder(data);
                mCamera.addCallbackBuffer(data);
            }
        });
        mCamera.startPreview();
    } catch (IOException e) {
        e.printStackTrace();
    }
}

Close it when releasing the camera:

private void releaseCamera() {
    if (mCamera != null) {
        mCamera.stopPreview();
        mCamera.release();
        mCamera = null;
    }
    if (mAvcEncoder != null) {
        mAvcEncoder.close();
    }
}

Solution

You're saving a raw H.264 stream. You should convert it to .mp4 format. The easiest way to do this is with the MediaMuxer class (API 18+).
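As a sketch of that approach (assumes API 18+ and an Android device; `MediaMuxer` and its methods are real framework APIs, but the `Mp4Writer` wrapper below is a hypothetical helper, not code from the question), each encoded buffer returned by `dequeueOutputBuffer` is handed to a `MediaMuxer` instead of being written to a plain `FileOutputStream`:

```java
// Sketch: wrap MediaCodec output in an .mp4 container via MediaMuxer.
// Mp4Writer is a hypothetical helper name; the MediaMuxer calls are real.
import java.io.IOException;
import java.nio.ByteBuffer;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;

public class Mp4Writer {
    private final MediaMuxer muxer;
    private int trackIndex = -1;
    private boolean started = false;

    public Mp4Writer(String path) throws IOException {
        muxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    }

    // Call once, when dequeueOutputBuffer returns INFO_OUTPUT_FORMAT_CHANGED;
    // the format passed here comes from mediaCodec.getOutputFormat().
    public void start(MediaFormat format) {
        trackIndex = muxer.addTrack(format);
        muxer.start();
        started = true;
    }

    // Call for each encoded buffer; bufferInfo.presentationTimeUs must be set.
    // Codec-config buffers (SPS/PPS) are skipped: addTrack already has them.
    public void writeSample(ByteBuffer encodedData, MediaCodec.BufferInfo info) {
        if (started && info.size > 0
                && (info.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
            muxer.writeSampleData(trackIndex, encodedData, info);
        }
    }

    public void close() {
        if (started) {
            muxer.stop();
        }
        muxer.release();
    }
}
```

This replaces the `outputStream.write(...)` call in `offerEncoder`; the rest of the encode loop stays the same.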

You can find a simple example on bigflake and more complete examples in Grafika.

You will need to provide presentation time stamps for each frame. You can either generate them according to your desired frame rate (like the bigflake example) or acquire them from the source (like the camera-input examples in Grafika).
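A minimal sketch of the first option, deriving the timestamp from a fixed frame rate (the `ptsUs` helper name is hypothetical). The resulting value would be passed as the `presentationTimeUs` argument of `queueInputBuffer`, instead of the constant 0 used in the question's code:

```java
// Sketch: generate a presentation timestamp (in microseconds) per frame
// from a target frame rate, e.g. the 15 fps set in the MediaFormat.
public class PtsCalc {
    // Presentation time in microseconds for frame number frameIndex.
    public static long ptsUs(long frameIndex, int frameRate) {
        return frameIndex * 1_000_000L / frameRate;
    }

    public static void main(String[] args) {
        // Frame 0 starts at 0 us; at 15 fps frames are ~66,666 us apart.
        System.out.println(ptsUs(0, 15));  // 0
        System.out.println(ptsUs(1, 15));  // 66666
        System.out.println(ptsUs(15, 15)); // 1000000
    }
}
```

With camera input you would instead keep the timestamps the source gives you, since preview frames do not arrive at a perfectly uniform rate.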

Edit: For pre-API-18 devices (Android 4.1/4.2), MediaCodec is much more difficult to work with. You can't use Surface input or MediaMuxer, and the lack of platform tests led to some unfortunate incompatibilities. This answer has an overview.

In your specific case, I will note that your sample code is attempting to specify the input format, but that has no effect -- the AVC codec defines what input formats it accepts, and your app must query for it. You will likely find that the colors in your encoded video are currently wrong, as the Camera and MediaCodec don't have any color formats in common (see http://stackoverflow.com/questions/13703596/mediacodec-and-camera-colorspaces-dont-match for color-swap code).
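A sketch of such a query, assuming the Android SDK (`MediaCodecList`, `MediaCodecInfo`, and the `colorFormats` field are real framework APIs available since API 16; the `selectColorFormat` helper is a hypothetical name):

```java
// Sketch: ask the AVC encoder which input color formats it supports,
// instead of hard-coding COLOR_FormatYUV420Planar.
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;

public class ColorFormatQuery {
    // Returns the first YUV420 format the "video/avc" encoder accepts,
    // or -1 if none is found.
    public static int selectColorFormat() {
        for (int i = 0; i < MediaCodecList.getCodecCount(); i++) {
            MediaCodecInfo info = MediaCodecList.getCodecInfoAt(i);
            if (!info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/avc")) continue;
                MediaCodecInfo.CodecCapabilities caps =
                        info.getCapabilitiesForType(type);
                for (int fmt : caps.colorFormats) {
                    if (fmt == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar
                            || fmt == MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar) {
                        return fmt;
                    }
                }
            }
        }
        return -1;
    }
}
```

The returned value is what should be passed to `MediaFormat.setInteger(KEY_COLOR_FORMAT, ...)`; if the device reports the semi-planar format, the YV12-to-planar conversion in `offerEncoder` would also need to change accordingly.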
