Unable to mux both audio and video


Problem description


I'm writing an app that records screen capture and audio using MediaCodec. I use MediaMuxer to mux the video and audio into an MP4 file. I successfully managed to write video and audio separately; however, when I try muxing them together live, the result is unexpected: either the audio plays without video, or the video plays right after the audio. My guess is that I'm doing something wrong with the timestamps, but I can't figure out what exactly. I already looked at these examples: https://github.com/OnlyInAmerica/HWEncoderExperiments/tree/audiotest/HWEncoderExperiments/src/main/java/net/openwatch/hwencoderexperiments and the ones on bigflake.com, and was not able to find the answer.


Here are my media format configurations:

    mVideoFormat = createVideoFormat();

    private static MediaFormat createVideoFormat() {
        MediaFormat format = MediaFormat.createVideoFormat(
                Preferences.MIME_TYPE, mScreenWidth, mScreenHeight);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, Preferences.BIT_RATE);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, Preferences.FRAME_RATE);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL,
                Preferences.IFRAME_INTERVAL);
        return format;
    }

    mAudioFormat = createAudioFormat();

    private static MediaFormat createAudioFormat() {
        MediaFormat format = new MediaFormat();
        format.setString(MediaFormat.KEY_MIME, "audio/mp4a-latm");
        format.setInteger(MediaFormat.KEY_AAC_PROFILE,
                MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        format.setInteger(MediaFormat.KEY_SAMPLE_RATE, 44100);
        format.setInteger(MediaFormat.KEY_CHANNEL_COUNT, 1);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 64000);
        return format;
    }


Audio and video encoders, muxer:

    mVideoEncoder = MediaCodec.createEncoderByType(Preferences.MIME_TYPE);
    mVideoEncoder.configure(mVideoFormat, null, null,
            MediaCodec.CONFIGURE_FLAG_ENCODE);
    mInputSurface = new InputSurface(mVideoEncoder.createInputSurface(),
            mSavedEglContext);
    mVideoEncoder.start();

    if (recordAudio) {
        audioBufferSize = AudioRecord.getMinBufferSize(44100,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        mAudioRecorder = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, audioBufferSize);
        mAudioRecorder.startRecording();

        mAudioEncoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
        mAudioEncoder.configure(mAudioFormat, null, null,
                MediaCodec.CONFIGURE_FLAG_ENCODE);
        mAudioEncoder.start();
    }

    try {
        String fileId = String.valueOf(System.currentTimeMillis());
        mMuxer = new MediaMuxer(dir.getPath() + "/Video" + fileId + ".mp4",
                MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    } catch (IOException ioe) {
        throw new RuntimeException("MediaMuxer creation failed", ioe);
    }

    mVideoTrackIndex = -1;
    mAudioTrackIndex = -1;
    mMuxerStarted = false;


I use this to set up video timestamps:

    mInputSurface.setPresentationTime(mSurfaceTexture.getTimestamp());
    drainVideoEncoder(false);


And this to set up audio time stamps:

    lastQueuedPresentationTimeStampUs = getNextQueuedPresentationTimeStampUs();

    if (endOfStream)
        mAudioEncoder.queueInputBuffer(inputBufferIndex, 0, audioBuffer.length,
                lastQueuedPresentationTimeStampUs, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
    else
        mAudioEncoder.queueInputBuffer(inputBufferIndex, 0, audioBuffer.length,
                lastQueuedPresentationTimeStampUs, 0);

    mAudioBufferInfo.presentationTimeUs = getNextDeQueuedPresentationTimeStampUs();
    mMuxer.writeSampleData(mAudioTrackIndex, encodedData, mAudioBufferInfo);
    lastDequeuedPresentationTimeStampUs = mAudioBufferInfo.presentationTimeUs;

    private static long getNextQueuedPresentationTimeStampUs() {
        long nextQueuedPresentationTimeStampUs =
                (lastQueuedPresentationTimeStampUs > lastDequeuedPresentationTimeStampUs)
                        ? (lastQueuedPresentationTimeStampUs + 1)
                        : (lastDequeuedPresentationTimeStampUs + 1);
        Log.i(TAG, "nextQueuedPresentationTimeStampUs: " + nextQueuedPresentationTimeStampUs);
        return nextQueuedPresentationTimeStampUs;
    }

    private static long getNextDeQueuedPresentationTimeStampUs() {
        Log.i(TAG, "nextDequeuedPresentationTimeStampUs: " + (lastDequeuedPresentationTimeStampUs + 1));
        lastDequeuedPresentationTimeStampUs++;
        return lastDequeuedPresentationTimeStampUs;
    }


I took that from this example: https://github.com/OnlyInAmerica/HWEncoderExperiments/blob/audiotest/HWEncoderExperiments/src/main/java/net/openwatch/hwencoderexperiments/AudioEncodingTest.java in order to avoid the "timestampUs XXX < lastTimestampUs XXX" error.


Can someone help me figure out the problem, please?

Answer


It looks like you're using system-provided time stamps for video, but a simple counter for audio. Unless somehow the video timestamp is being used to seed the audio every frame and it's just not shown above.


For audio and video to play in sync, you need to have the same presentation time stamp on audio and video frames that are expected to be presented at the same time.
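To illustrate the point, here is a minimal sketch (not from the asker's code; the class and names are hypothetical) of deriving each audio presentation timestamp from the count of PCM samples already queued, anchored to a base time taken from the same clock the video timestamps use, instead of a bare +1 counter:

```java
// Sketch: derive audio PTS from the PCM sample count so both tracks
// share one timebase. baseTimeUs would be captured from the same clock
// as the video timestamps (e.g. System.nanoTime() / 1000 at recording
// start). Names here are illustrative, not from the original code.
public class AudioPts {
    private final long baseTimeUs;  // recording start on the shared clock
    private final int sampleRate;   // e.g. 44100
    private long totalSamples = 0;  // mono 16-bit PCM samples queued so far

    public AudioPts(long baseTimeUs, int sampleRate) {
        this.baseTimeUs = baseTimeUs;
        this.sampleRate = sampleRate;
    }

    // Call once per input buffer; byteCount is the PCM byte count being queued.
    public long nextPtsUs(int byteCount) {
        long ptsUs = baseTimeUs + (totalSamples * 1_000_000L) / sampleRate;
        totalSamples += byteCount / 2; // 16-bit mono: 2 bytes per sample
        return ptsUs;
    }
}
```

With this scheme the timestamps advance at the real audio rate (one second of samples advances the PTS by one second), so they stay comparable to the video timestamps rather than drifting by one microsecond per buffer.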

Also see this related question: http://stackoverflow.com/questions/20972049/how-to-provide-both-audio-data-and-video-data-to-mediamux

