Performance issues when using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput


Problem description

I'm having lag issues when I'm recording audio+video by using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. Sometimes the video blocks for a few milliseconds, sometimes the audio is not in sync with the video.

I inserted some logs and observed that at first I get a lot of video buffers in the captureOutput callback, and only after some time do I get the audio buffers (sometimes I don't receive the audio buffers at all, and the resulting video has no sound). If I comment out the code that handles the video buffers, I get the audio buffers without problems.

This is the code I'm using:

-(void)initMovieOutput:(AVCaptureSession *)captureSessionLocal
{   
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    self._videoOutput = dataOutput;
    [dataOutput release];

    self._videoOutput.alwaysDiscardsLateVideoFrames = NO;
    self._videoOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey
                                  ];     
    AVCaptureAudioDataOutput *audioOutput =  [[AVCaptureAudioDataOutput alloc] init];
    self._audioOutput = audioOutput;
    [audioOutput release];

    [captureSessionLocal addOutput:self._videoOutput];
    [captureSessionLocal addOutput:self._audioOutput];


    // Setup the queue
    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [self._videoOutput setSampleBufferDelegate:self queue:queue];
    [self._audioOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
}

Here is where I set up the writer:

-(BOOL) setupWriter:(NSURL *)videoURL session:(AVCaptureSession *)captureSessionLocal
{
    NSError *error = nil;
    self._videoWriter = [[AVAssetWriter alloc] initWithURL:videoURL fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
    NSParameterAssert(self._videoWriter);


    // Add video input  
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:640], AVVideoWidthKey,
                                   [NSNumber numberWithInt:480], AVVideoHeightKey,
                                   nil];

    self._videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings];


    NSParameterAssert(self._videoWriterInput);
    self._videoWriterInput.expectsMediaDataInRealTime = YES;
    self._videoWriterInput.transform = [self returnOrientation];

    // Add the audio input
    AudioChannelLayout acl;
    bzero( &acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;


    NSDictionary* audioOutputSettings = nil;          
    // Both types of audio settings caused the output video file to be corrupted.

        // Should work on any device, but requires more space.
        audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:                       
                               [ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
                               [ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
                               [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
                               [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,                                      
                               [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                               nil ];

    self._audioWriterInput = [AVAssetWriterInput 
                                        assetWriterInputWithMediaType: AVMediaTypeAudio 
                                        outputSettings: audioOutputSettings ];

    self._audioWriterInput.expectsMediaDataInRealTime = YES;    

    // add input
    [self._videoWriter addInput:_videoWriterInput];
    [self._videoWriter addInput:_audioWriterInput];

    return YES;
}

And here is the callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{

    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        return;
    }
    if( _videoWriter.status !=  AVAssetWriterStatusCompleted )
    {
        if( _videoWriter.status != AVAssetWriterStatusWriting  )
        {               
            CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
            [_videoWriter startWriting];
            [_videoWriter startSessionAtSourceTime:lastSampleTime];
        }

        if( captureOutput == _videoOutput )
        {
            if( [self._videoWriterInput isReadyForMoreMediaData] )
            {
                [self newVideoSample:sampleBuffer];
            }
        }
        else if( captureOutput == _audioOutput )
        {
            if( [self._audioWriterInput isReadyForMoreMediaData] )
            {
                [self newAudioSample:sampleBuffer];
            }
        }
    }

}

-(void) newAudioSample:(CMSampleBufferRef)sampleBuffer
{
    if( _videoWriter.status > AVAssetWriterStatusWriting )
    {
        [self NSLogPrint:[NSString stringWithFormat:@"Audio:Warning: writer status is %d", _videoWriter.status]];
        if( _videoWriter.status == AVAssetWriterStatusFailed )
            [self NSLogPrint:[NSString stringWithFormat:@"Audio:Error: %@", _videoWriter.error]];
        return;
    }

    if( ![_audioWriterInput appendSampleBuffer:sampleBuffer] )
        [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to audio input"]];
}

-(void) newVideoSample:(CMSampleBufferRef)sampleBuffer
{
    if( _videoWriter.status > AVAssetWriterStatusWriting )
    {
        [self NSLogPrint:[NSString stringWithFormat:@"Video:Warning: writer status is %d", _videoWriter.status]];
        if( _videoWriter.status == AVAssetWriterStatusFailed )
            [self NSLogPrint:[NSString stringWithFormat:@"Video:Error: %@", _videoWriter.error]];
        return;
    }


    if( ![_videoWriterInput appendSampleBuffer:sampleBuffer] )
        [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to video input"]];
}

Is there something wrong in my code? Why does the video lag? (I'm testing on an iPhone 4, iOS 4.2.1.)

Recommended answer

It looks like you are using serial queues. The audio output queue is right behind the video output queue, so slow video processing delays (or starves) the audio callbacks. Consider using separate queues for the two outputs.
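A minimal sketch of that change against the question's initMovieOutput: code (the queue labels are placeholder names). Note that Apple's documentation requires the queue passed to setSampleBufferDelegate:queue: to be serial, so in practice "concurrent" here means giving each output its own serial queue so the two outputs run concurrently with each other:

    // Give each output its own serial queue so heavy video processing
    // does not block the audio callbacks behind it in the same queue.
    dispatch_queue_t videoQueue = dispatch_queue_create("com.example.videoQueue", NULL);
    dispatch_queue_t audioQueue = dispatch_queue_create("com.example.audioQueue", NULL);

    [self._videoOutput setSampleBufferDelegate:self queue:videoQueue];
    [self._audioOutput setSampleBufferDelegate:self queue:audioQueue];

    // Under manual reference counting (as in the question's code),
    // release the queues; the outputs retain them.
    dispatch_release(videoQueue);
    dispatch_release(audioQueue);

With two delegate queues, newVideoSample: and newAudioSample: can run on different threads, so appends to the shared AVAssetWriter may need their own synchronization (for example, a third serial queue that owns the writer).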

