iOS 5 - AVCaptureDevice setting focus point and focus mode freezes the live camera picture


Problem description

I've been using the following method to set the point of focus since iOS 4:

- (void)focusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = [[self captureInput] device];
    NSError *error = nil;

    if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus] &&
        [device isFocusPointOfInterestSupported])
    {
        if ([device lockForConfiguration:&error]) {
            // Set the point of interest first, then trigger a single autofocus pass.
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        } else {
            NSLog(@"Error: %@", error);
        }
    }
}

On iOS 4 devices this works without any problems. On iOS 5, however, the live camera feed freezes and after a few seconds goes completely black. No exception or error is thrown.

The problem does not occur if I comment out either setFocusPointOfInterest or setFocusMode, so it is the combination of the two that triggers this behavior.

Recommended answer

The point you are passing to setFocusPointOfInterest: is invalid, and that is why the preview breaks. focusPointOfInterest expects a normalized point between {0,0} and {1,1} in the capture device's coordinate space, not a position in your view's coordinates, so raw tap coordinates such as {160, 240} are far out of range.

Add the following method to your program and pass the point it returns to focusAtPoint: instead:

- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates
{
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [[self videoPreviewView] frame].size;

    AVCaptureVideoPreviewLayer *videoPreviewLayer = [self prevLayer];

    // Mirror the x coordinate if the preview is mirrored (e.g. the front camera).
    if ([videoPreviewLayer isMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }

    if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
        // Stretched to fill: a simple scale (plus the rotation into sensor space) is enough.
        pointOfInterest = CGPointMake(viewCoordinates.y / frameSize.height, 1.f - (viewCoordinates.x / frameSize.width));
    } else {
        CGRect cleanAperture;
        for (AVCaptureInputPort *port in [[[[self captureSession] inputs] lastObject] ports]) {
            if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
                cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
                CGSize apertureSize = cleanAperture.size;
                CGPoint point = viewCoordinates;

                CGFloat apertureRatio = apertureSize.height / apertureSize.width;
                CGFloat viewRatio = frameSize.width / frameSize.height;
                CGFloat xc = .5f;
                CGFloat yc = .5f;

                if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    // Aspect fit: the video is letterboxed/pillarboxed inside the view.
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = frameSize.height;
                        CGFloat x2 = frameSize.height * apertureRatio;
                        CGFloat x1 = frameSize.width;
                        CGFloat blackBar = (x1 - x2) / 2;
                        // Only convert taps that land inside the visible video area.
                        if (point.x >= blackBar && point.x <= blackBar + x2) {
                            xc = point.y / y2;
                            yc = 1.f - ((point.x - blackBar) / x2);
                        }
                    } else {
                        CGFloat y2 = frameSize.width / apertureRatio;
                        CGFloat y1 = frameSize.height;
                        CGFloat x2 = frameSize.width;
                        CGFloat blackBar = (y1 - y2) / 2;
                        if (point.y >= blackBar && point.y <= blackBar + y2) {
                            xc = ((point.y - blackBar) / y2);
                            yc = 1.f - (point.x / x2);
                        }
                    }
                } else if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    // Aspect fill: part of the video is cropped off the edges of the view.
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = apertureSize.width * (frameSize.width / apertureSize.height);
                        xc = (point.y + ((y2 - frameSize.height) / 2.f)) / y2;
                        yc = (frameSize.width - point.x) / frameSize.width;
                    } else {
                        CGFloat x2 = apertureSize.height * (frameSize.height / apertureSize.width);
                        yc = 1.f - ((point.x + ((x2 - frameSize.width) / 2)) / x2);
                        xc = point.y / frameSize.height;
                    }
                }

                pointOfInterest = CGPointMake(xc, yc);
                break;
            }
        }
    }

    return pointOfInterest;
}
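
For completeness, here is a minimal sketch of how the two pieces fit together in a tap-to-focus gesture handler. The handler name handleTapToFocus: is hypothetical, and it assumes the same videoPreviewView property used above plus the focusAtPoint: method from the question:

- (void)handleTapToFocus:(UITapGestureRecognizer *)gestureRecognizer
{
    // Tap location in the preview view's coordinate system, e.g. {160, 240}.
    CGPoint viewPoint = [gestureRecognizer locationInView:[self videoPreviewView]];

    // Convert it to the normalized {0,0}-{1,1} space the capture device expects.
    CGPoint devicePoint = [self convertToPointOfInterestFromViewCoordinates:viewPoint];

    // This value is now safe to hand to setFocusPointOfInterest:.
    [self focusAtPoint:devicePoint];
}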
