AVCapture capturing and getting framebuffer at 60 fps in iOS 7


Problem description

I'm developing an app that needs to capture the framebuffer at as high an fps as possible. I've already figured out how to force the iPhone to capture at 60 fps, but the

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection

method is being called only 15 times a second, which means the iPhone downgrades the capture output to 15 fps.

Has anybody faced this problem? Is there any way to increase the capture frame rate?

Update: my code:

camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([camera isTorchModeSupported:AVCaptureTorchModeOn]) {
   if ([camera lockForConfiguration:nil]) {
      camera.torchMode = AVCaptureTorchModeOn;
      [camera unlockForConfiguration];
   }
}
[self configureCameraForHighestFrameRate:camera];

// Create a AVCaptureInput with the camera device
NSError *error=nil;
AVCaptureInput* cameraInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
if (cameraInput == nil) {
   NSLog(@"Failed to create camera input: %@", error);
}

// Set the output
AVCaptureVideoDataOutput* videoOutput = [[AVCaptureVideoDataOutput alloc] init];

// create a queue to run the capture on
dispatch_queue_t captureQueue=dispatch_queue_create("captureQueue", NULL);

// setup our delegate
[videoOutput setSampleBufferDelegate:self queue:captureQueue];

// configure the pixel format
videoOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};

// Add the input and output
[captureSession addInput:cameraInput];
[captureSession addOutput:videoOutput];

I took the configureCameraForHighestFrameRate method from here: https://developer.apple.com/library/mac/documentation/AVFoundation/Reference/AVCaptureDevice_Class/Reference/Reference.html
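For reference, the method in Apple's AVCaptureDevice documentation looks roughly like this (a sketch from memory of the Apple sample, not copied verbatim):

```objc
- (void)configureCameraForHighestFrameRate:(AVCaptureDevice *)device
{
    AVCaptureDeviceFormat *bestFormat = nil;
    AVFrameRateRange *bestFrameRateRange = nil;
    // Walk every format the device supports and remember the one
    // whose frame-rate range has the highest maxFrameRate.
    for (AVCaptureDeviceFormat *format in [device formats]) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate > bestFrameRateRange.maxFrameRate) {
                bestFormat = format;
                bestFrameRateRange = range;
            }
        }
    }
    if (bestFormat && [device lockForConfiguration:NULL]) {
        // Pin both min and max frame duration to the fastest rate.
        device.activeFormat = bestFormat;
        device.activeVideoMinFrameDuration = bestFrameRateRange.minFrameDuration;
        device.activeVideoMaxFrameDuration = bestFrameRateRange.minFrameDuration;
        [device unlockForConfiguration];
    }
}
```

Note that this must be called after the device input has been created and, on some configurations, after the format is chosen the session's sessionPreset no longer applies.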

Answer

I am getting samples at 60 fps on the iPhone 5 and 120 fps on the iPhone 5s, both when doing real-time motion detection in captureOutput and when saving the frames to a video using AVAssetWriter.

You have to set the AVCaptureSession to a format that supports 60 fps:

AVsession = [[AVCaptureSession alloc] init];

AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *capInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
if (capInput) [AVsession addInput:capInput];

for(AVCaptureDeviceFormat *vFormat in [videoDevice formats] ) 
{
    CMFormatDescriptionRef description= vFormat.formatDescription;
    float maxrate=((AVFrameRateRange*)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;

    if(maxrate>59 && CMFormatDescriptionGetMediaSubType(description)==kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
    {
        if ( YES == [videoDevice lockForConfiguration:NULL] ) 
        {
           videoDevice.activeFormat = vFormat;
           [videoDevice setActiveVideoMinFrameDuration:CMTimeMake(10,600)];
           [videoDevice setActiveVideoMaxFrameDuration:CMTimeMake(10,600)];
           [videoDevice unlockForConfiguration];
           NSLog(@"formats  %@ %@ %@",vFormat.mediaType,vFormat.formatDescription,vFormat.videoSupportedFrameRateRanges);
        }
     }
}

prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: AVsession];
prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer: prevLayer];

AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
[videoOut setSampleBufferDelegate:self queue:videoQueue];

videoOut.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
videoOut.alwaysDiscardsLateVideoFrames=YES;

if (videoOut)
{
    [AVsession addOutput:videoOut];
    videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
}
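One way to verify the rate actually delivered is to count delegate callbacks per second. A minimal sketch, assuming `frameCount` and `lastTimestamp` ivars on the delegate class (names here are illustrative; CACurrentMediaTime needs QuartzCore):

```objc
// Assumed ivars on the delegate:
//   NSUInteger frameCount;
//   CFTimeInterval lastTimestamp;
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    frameCount++;
    CFTimeInterval now = CACurrentMediaTime();
    if (now - lastTimestamp >= 1.0) {
        // Number of callbacks delivered over the last second.
        NSLog(@"delivered fps: %lu", (unsigned long)frameCount);
        frameCount = 0;
        lastTimestamp = now;
    }
    // ... per-frame processing goes here ...
}
```

Keep the per-frame work short: with alwaysDiscardsLateVideoFrames set to YES, any processing slower than the frame interval will show up here as dropped frames and a lower delivered fps.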

Two other comments if you want to write to a file using AVAssetWriter. Don't use the pixel buffer adaptor; just append the samples with

[videoWriterInput appendSampleBuffer:sampleBuffer]

Secondly, when setting up the asset writer, use

[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                   outputSettings:videoSettings 
                                 sourceFormatHint:formatDescription];

The sourceFormatHint makes a difference in writing speed.
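A minimal sketch of that setup, taking the format description from the device's active high-fps format; `videoDevice`, `videoWriter`, and the output dimensions are assumptions carried over from the code above:

```objc
// The active format is the 60/120 fps format selected earlier;
// its description is what sourceFormatHint expects.
CMFormatDescriptionRef formatDescription =
    videoDevice.activeFormat.formatDescription;

// Illustrative output settings; adjust codec/size to your needs.
NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                 AVVideoWidthKey  : @1280,
                                 AVVideoHeightKey : @720 };

AVAssetWriterInput *videoWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings
                                     sourceFormatHint:formatDescription];
// Required for capture-driven writing: don't let the input buffer up.
videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];
```

With the hint supplied, the writer can configure its pipeline for the incoming format up front instead of inferring it from the first appended sample buffer.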

