Can AVCaptureVideoDataOutput and AVCaptureMovieFileOutput be used at the same time?
Question
I want to record video and grab frames at the same time in my code. I am using AVCaptureVideoDataOutput to grab frames and AVCaptureMovieFileOutput for video recording. Each works individually, but when I use them together it doesn't work and I get error code -12780.

I searched for this problem but found no answer. Has anyone had the same experience, or can anyone explain it? It has been bothering me for a while.

Thanks.
Answer
I can't answer the specific question put, but I've been successfully recording video and grabbing frames at the same time using:
- AVCaptureSession and AVCaptureVideoDataOutput to route frames into my own code
- AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write frames out to an H.264-encoded movie file
That's without investigating audio. I end up getting CMSampleBuffers from the capture session and then pushing them into the pixel buffer adaptor.
EDIT: so my code looks more or less like this, with the bits you're having no problems with skimmed over, and ignoring issues of scope:
/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = alloc and init, set your preferred preset/etc;
AVCaptureDevice *captureDevice = default for video, probably;
AVCaptureDeviceInput *deviceInput = input with device as above,
                                    and attach it to the session;
AVCaptureVideoDataOutput *output = output for 32BGRA pixel format, with me as the
                                   delegate and a suitable dispatch queue affixed.
/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithInt:640], AVVideoWidthKey,
        [NSNumber numberWithInt:480], AVVideoHeightKey,
        AVVideoCodecH264, AVVideoCodecKey,
        nil];

AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput
    assetWriterInputWithMediaType:AVMediaTypeVideo
    outputSettings:outputSettings];
/* I'm going to push pixel buffers to it, so will need an
   AVAssetWriterInputPixelBufferAdaptor, to expect the same 32BGRA input as I've
   asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
    [[AVAssetWriterInputPixelBufferAdaptor alloc]
        initWithAssetWriterInput:assetWriterInput
        sourcePixelBufferAttributes:
            [NSDictionary dictionaryWithObjectsAndKeys:
                [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                kCVPixelBufferPixelFormatTypeKey,
                nil]];
/* that's going to go somewhere, I imagine you've got the URL for that sorted,
   so create a suitable asset writer; we'll put our H.264 within the normal
   MPEG4 container */
AVAssetWriter *assetWriter = [[AVAssetWriter alloc]
    initWithURL:URLFromSomewhere
    fileType:AVFileTypeMPEG4
    error:you need to check error conditions,
          this example is too lazy];
[assetWriter addInput:assetWriterInput];
/* we need to warn the input to expect real time data incoming, so that it tries
to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;
... eventually ...
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];
... elsewhere ...
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    static int64_t frameNumber = 0;
    if (assetWriterInput.readyForMoreMediaData) {
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
        // increment only on a successful append, so that a dropped frame
        // doesn't permanently shift all later timestamps
        frameNumber++;
    }
}
... and, to stop, ensuring the output file is finished properly ...
[captureSession stopRunning];
[assetWriter finishWriting];
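As the comment in the delegate method notes, the hard-coded 25 fps frame counter is just an example; it will drift from real time if the camera delivers frames at a different rate or drops any. A more robust variant (a sketch under the same scope assumptions as above; `sessionStarted` is a hypothetical flag you'd hold alongside the writer objects) stamps each frame with the sample buffer's own capture time via CMSampleBufferGetPresentationTimeStamp, and starts the writer's session at the first frame's timestamp instead of kCMTimeZero:

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);

    // start the writer's timeline at the first frame's capture time,
    // rather than calling startSessionAtSourceTime:kCMTimeZero up front
    if (!sessionStarted) {
        [assetWriter startSessionAtSourceTime:timestamp];
        sessionStarted = YES;
    }

    if (assetWriterInput.readyForMoreMediaData) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // the buffer's own timestamp stays correct across dropped or
        // irregularly spaced frames, unlike a fixed-rate counter
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:timestamp];
    }
}
```

With this approach you would omit the earlier `startSessionAtSourceTime:kCMTimeZero` call, since the session now begins at the first captured frame's time.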