Focus (Autofocus) not working in camera (AVFoundation AVCaptureSession)


Question

I am using standard AVFoundation classes to capture video and show the preview (http://developer.apple.com/library/ios/#qa/qa1702/_index.html).

Here is my code:

- (void)setupCaptureSession {       
    NSError *error = nil;

    [self setCaptureSession: [[AVCaptureSession alloc] init]]; 

    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device 
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the device input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    output.videoSettings = 
    [NSDictionary dictionaryWithObject:
     [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                                forKey:(id)kCVPixelBufferPixelFormatTypeKey];


    // Note: minFrameDuration on AVCaptureVideoDataOutput was deprecated in
    // iOS 5; on newer SDKs set videoMinFrameDuration on the AVCaptureConnection
    // (or activeVideoMinFrameDuration on the device) instead.
    output.minFrameDuration = CMTimeMake(1, 15);

    [[self captureSession] startRunning];

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];

    mutex = YES;
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection { 
    if (mutex && ![device isAdjustingFocus] && ![device isAdjustingExposure] && ![device isAdjustingWhiteBalance]) {
        // something
    }
}

// Create a UIImage from sample buffer data
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
                                                 bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Free up the context and color space
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return (image);
}

Everything is working OK, but sometimes there are some issues:


  • Camera focus is not working — it is random: sometimes it works, sometimes it doesn't. I have tried it on different devices, iPhone 4 and 3GS. I tried to google it, but found nothing. People only mention broken devices, but I have checked three iPhone 4s and one iPhone 3GS. The problem occurs everywhere.

  • Camera load time is quite long. I use the ScannerKit API, which also uses the camera for the same purpose, and it loads about twice as fast as my implementation.

Any ideas what the problem could be? The first issue is definitely more important.

Answer

Old question, but it may save somebody hours of frustration anyway. It's important to set the point of interest before calling setFocusMode; otherwise your camera will set focus to the previous focus point. Think of setFocusMode as a COMMIT. The same applies to setExposureMode.
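The ordering described above can be sketched as follows. This is a minimal illustration, not code from the question: it assumes `device` is the `AVCaptureDevice` obtained during capture setup, and the hypothetical `focusAtPoint:` method takes a point in AVFoundation's {0,0}–{1,1} coordinate space ({0.5, 0.5} is the center of the frame).

```objectivec
// Set the point of interest FIRST, then set the mode (the "commit").
- (void)focusAtPoint:(CGPoint)point {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        // 1. Point of interest first...
        if ([device isFocusPointOfInterestSupported]) {
            [device setFocusPointOfInterest:point];
        }
        // 2. ...then the focus mode, which applies the new point.
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus]) {
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        }
        // Same ordering for exposure: point of interest, then mode.
        if ([device isExposurePointOfInterestSupported]) {
            [device setExposurePointOfInterest:point];
        }
        if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) {
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        }
        [device unlockForConfiguration];
    }
}
```

If the mode is set before the point, the device commits focus at the old point of interest, which matches the random focus behavior described in the question.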

Apple's AVCam sample is totally wrong and broken.
