Can I use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time?


Question

I want to record video and grab frames at the same time with my code.

I am using AVCaptureVideoDataOutput to grab frames and AVCaptureMovieFileOutput for video recording. Each works on its own, but they don't work at the same time: with both attached I get error code -12780.

I searched for this problem but found no answer. Has anyone had the same experience, or an explanation? It has really been bothering me for a while.

Thanks.

Answer

I can't answer the specific question put, but I've been successfully recording video and grabbing frames at the same time using:

  • AVCaptureSession and AVCaptureVideoDataOutput to route frames into my own code
  • AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write frames out to an H.264 encoded movie file

That's without investigating audio. I end up getting CMSampleBuffers from the capture session and then pushing them into the pixel buffer adaptor.

So my code looks more or less like the following, with the bits you're having no problems with skimmed over, and ignoring issues of scope:

/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPreset640x480;
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
[captureSession addInput:deviceInput];

/* 32BGRA pixel format, with me as the delegate and a suitable dispatch queue affixed */
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                   forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("captureQueue", NULL)];
[captureSession addOutput:output];

/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:

            [NSNumber numberWithInt:640], AVVideoWidthKey,
            [NSNumber numberWithInt:480], AVVideoHeightKey,
            AVVideoCodecH264, AVVideoCodecKey,

            nil];

AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput 
                                   assetWriterInputWithMediaType:AVMediaTypeVideo
                                                  outputSettings:outputSettings];

/* I'm going to push pixel buffers to it, so will need an
   AVAssetWriterInputPixelBufferAdaptor, to expect the same 32BGRA input as I've
   asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
           [[AVAssetWriterInputPixelBufferAdaptor alloc] 
                initWithAssetWriterInput:assetWriterInput 
                sourcePixelBufferAttributes:
                     [NSDictionary dictionaryWithObjectsAndKeys:
                          [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], 
                           kCVPixelBufferPixelFormatTypeKey,
                     nil]];

/* that's going to go somewhere, I imagine you've got the URL for that sorted,
   so create a suitable asset writer; we'll put our H.264 within the normal
   MPEG4 container */
NSError *writerError = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc]
                                initWithURL:URLFromSomwhere
                                   fileType:AVFileTypeMPEG4
                                      error:&writerError];
/* a real implementation must check writerError here;
   this example is too lazy to do so */
[assetWriter addInput:assetWriterInput];

/* we need to warn the input to expect real time data incoming, so that it tries
   to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;

... eventually ...

[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];

... elsewhere ...

- (void)        captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
           fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    static int64_t frameNumber = 0;
    if (assetWriterInput.readyForMoreMediaData) {
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
    }
    frameNumber++;
}
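
As a variation (not part of the original answer), the presentation time can come from the sample buffer itself rather than a hand-counted 25 fps clock: CMSampleBufferGetPresentationTimeStamp returns the capture-time stamp of each frame. A sketch, assuming the same assetWriter, assetWriterInput and pixelBufferAdaptor as above, and starting the writer's session at the first frame's timestamp instead of kCMTimeZero:

```objc
- (void)        captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
{
    /* use the buffer's own capture timestamp as the presentation time */
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    static BOOL sessionStarted = NO;
    if (!sessionStarted) {
        /* anchor the writer's timeline at the first frame, rather than
           calling startSessionAtSourceTime:kCMTimeZero up front */
        [assetWriter startSessionAtSourceTime:timestamp];
        sessionStarted = YES;
    }

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (assetWriterInput.readyForMoreMediaData) {
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:timestamp];
    }
}
```

This also keeps timing consistent if dropped frames occur, since skipped buffers simply leave gaps in the timeline rather than shifting every later frame.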

... and, to stop, ensuring the output file is finished properly ...

[captureSession stopRunning];
/* mark the input as finished before closing the writer */
[assetWriterInput markAsFinished];
[assetWriter finishWriting];

