How can I get autofocus to work in a second AVCaptureSession without recreating the sessions?

Question

Autofocus is not working on the first AVCaptureSession when I create a second AVCaptureSession. Whichever session is created second is the one where autofocus works; the session created first does not autofocus.

I would expect either session to be able to autofocus when it is started after the other has been stopped, in the same way that auto white balance and auto exposure work for both sessions. If you watch the log window while running the sample code below, you can see the key-value observing messages coming through, but the adjustingFocus message never arrives while the top session is running.

Sidenote: Unfortunately, I have a bug in a third-party library I am using that prevents me from simply recreating the sessions entirely as I switch between them (it is leaking its AVCaptureSessions, which eventually causes the app to be killed). The full story is that this library creates one of the capture sessions for me; it has a public API to start and stop that session, and I want to create another session alongside it. The code below demonstrates the problem without using the third-party library.

I've created a test application that demonstrates the problem, using the code listed below and a XIB file that has two views, one above the other, plus a button hooked up to the switchSessions method.

It may be related to the problem described in Focus (Autofocus) not working in camera (AVFoundation AVCaptureSession), although no mention is made of two capture sessions there.
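
For completeness, here is a minimal sketch of explicitly re-asserting continuous autofocus on the shared device; the reapplyAutofocus helper name is mine and is not part of the sample code below. Calling something like this after restarting a session would rule out the focus mode itself being the problem:

- (void)reapplyAutofocus
{
    NSError *error = nil;

    // Re-assert continuous autofocus on the shared capture device, if the device supports it.
    if ([_device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] &&
        [_device lockForConfiguration:&error]) {
        [_device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [_device unlockForConfiguration];
    }
    else if (error) {
        NSLog(@"Could not lock device for configuration: %@", [error localizedDescription]);
    }
}

(The commented-out NSLog in observeValueForKeyPath:ofObject:change:context: below can be enabled to confirm that focusMode is set correctly, so this is only a sanity check.)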

Header file:

#import <UIKit/UIKit.h>

@class AVCaptureSession;
@class AVCaptureStillImageOutput;
@class AVCaptureVideoPreviewLayer;
@class AVCaptureDevice;
@class AVCaptureDeviceInput;

@interface AVCaptureSessionFocusBugViewController : UIViewController {

    IBOutlet UIView *_topView;
    IBOutlet UIView *_bottomView;

    AVCaptureDevice *_device;

    AVCaptureSession *_topSession;

    AVCaptureStillImageOutput *_outputTopSession;
    AVCaptureVideoPreviewLayer *_previewLayerTopSession;
    AVCaptureDeviceInput *_inputTopSession;

    AVCaptureSession *_bottomSession;

    AVCaptureStillImageOutput *_outputBottomSession;
    AVCaptureVideoPreviewLayer *_previewLayerBottomSession;
    AVCaptureDeviceInput *_inputBottomSession;
}

- (IBAction)switchSessions:(id)sender;

@end

Implementation file:

#import "AVCaptureSessionFocusBugViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface AVCaptureSessionFocusBugViewController ()

- (void)setupCaptureSession:(AVCaptureSession **)session
                     output:(AVCaptureStillImageOutput **)output
               previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                      input:(AVCaptureDeviceInput **)input
                       view:(UIView *)view;

- (void)tearDownSession:(AVCaptureSession **)session
                 output:(AVCaptureStillImageOutput **)output
           previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                  input:(AVCaptureDeviceInput **)input
                   view:(UIView *)view;

@end

@implementation AVCaptureSessionFocusBugViewController

- (IBAction)switchSessions:(id)sender
{
    if ([_topSession isRunning]) {
        [_topSession stopRunning];
        [_bottomSession startRunning];
        NSLog(@"Bottom session now running.");
    }
    else {
        [_bottomSession stopRunning];
        [_topSession startRunning];
        NSLog(@"Top session now running.");
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath 
                      ofObject:(id)object 
                        change:(NSDictionary *)change 
                       context:(void *)context
{
    NSLog(@"Observed value for key at key path %@.", keyPath);
    // Enable to confirm that the focusMode is set correctly.
    //NSLog(@"Autofocus for the device is set to %d.", [_device focusMode]);
}

- (void)viewDidLoad {
    [super viewDidLoad];

    _device = [[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] retain];

    [self setupCaptureSession:&_topSession 
                       output:&_outputTopSession
                 previewLayer:&_previewLayerTopSession
                        input:&_inputTopSession
                         view:_topView];

    [self setupCaptureSession:&_bottomSession 
                       output:&_outputBottomSession
                 previewLayer:&_previewLayerBottomSession
                        input:&_inputBottomSession
                         view:_bottomView];

    // NB: We only need to observe one device, since the top and bottom sessions use the same device.
    [_device addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:nil];
    [_device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:nil];
    [_device addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:nil];

    [_topSession startRunning];
    NSLog(@"Starting top session.");
}


- (void)setupCaptureSession:(AVCaptureSession **)session
                     output:(AVCaptureStillImageOutput **)output
               previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                      input:(AVCaptureDeviceInput **)input
                       view:(UIView *)view
{    
    *session = [[AVCaptureSession alloc] init];

    // Create the preview layer.
    *previewLayer = [[AVCaptureVideoPreviewLayer layerWithSession:*session] retain];

    [*previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    [*previewLayer setFrame:[view bounds]];

    [[view layer] addSublayer:*previewLayer];

    // Configure the inputs and outputs.
    [*session setSessionPreset:AVCaptureSessionPresetMedium];

    NSError *error = nil;

    *input = [[AVCaptureDeviceInput deviceInputWithDevice:_device error:&error] retain];

    if (!*input) {
        NSLog(@"Error creating input device:%@", [error localizedDescription]);
        return;
    }

    [*session addInput:*input];

    *output = [[AVCaptureStillImageOutput alloc] init];

    [*session addOutput:*output];

    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];

    [*output setOutputSettings:outputSettings];

    [outputSettings release];
}

- (void)viewDidUnload {
    [_topView release];
    _topView = nil;

    [_bottomView release];
    _bottomView = nil;

    // Stop observing the device before releasing it.
    [_device removeObserver:self forKeyPath:@"adjustingFocus"];
    [_device removeObserver:self forKeyPath:@"adjustingExposure"];
    [_device removeObserver:self forKeyPath:@"adjustingWhiteBalance"];

    [_device release];
    _device = nil;

    [self tearDownSession:&_topSession
                   output:&_outputTopSession
             previewLayer:&_previewLayerTopSession
                    input:&_inputTopSession
                     view:_topView];

    [self tearDownSession:&_bottomSession
                   output:&_outputBottomSession
             previewLayer:&_previewLayerBottomSession
                    input:&_inputBottomSession
                     view:_bottomView];
}

- (void)tearDownSession:(AVCaptureSession **)session
                 output:(AVCaptureStillImageOutput **)output
           previewLayer:(AVCaptureVideoPreviewLayer **)previewLayer
                  input:(AVCaptureDeviceInput **)input
                   view:(UIView *)view
{
    if ([*session isRunning]) {
        [*session stopRunning];
    }

    [*session removeOutput:*output];

    [*output release];
    *output = nil;

    [*session removeInput:*input];

    [*input release];
    *input = nil;

    [*previewLayer removeFromSuperlayer];

    [*previewLayer release];
    *previewLayer = nil;

    [*session release];
    *session = nil;
}

@end


Answer

Apple technical support has confirmed that creating two simultaneous capture sessions is not supported. You must tear down one and then create the other.
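
Given that, one way to apply the workaround is to keep only one session alive at a time and rebuild the other on each switch. Below is a sketch only, reusing the setupCaptureSession:... and tearDownSession:... helpers from the question and ignoring the third-party-library constraint mentioned above:

- (IBAction)switchSessions:(id)sender
{
    if (_topSession) {
        // Tear the top session down completely before creating the bottom one.
        [self tearDownSession:&_topSession
                       output:&_outputTopSession
                 previewLayer:&_previewLayerTopSession
                        input:&_inputTopSession
                         view:_topView];

        [self setupCaptureSession:&_bottomSession
                           output:&_outputBottomSession
                     previewLayer:&_previewLayerBottomSession
                            input:&_inputBottomSession
                             view:_bottomView];

        [_bottomSession startRunning];
        NSLog(@"Bottom session now running.");
    }
    else {
        // Tear the bottom session down completely before recreating the top one.
        [self tearDownSession:&_bottomSession
                       output:&_outputBottomSession
                 previewLayer:&_previewLayerBottomSession
                        input:&_inputBottomSession
                         view:_bottomView];

        [self setupCaptureSession:&_topSession
                           output:&_outputTopSession
                     previewLayer:&_previewLayerTopSession
                            input:&_inputTopSession
                             view:_topView];

        [_topSession startRunning];
        NSLog(@"Top session now running.");
    }
}

With this approach, viewDidLoad would create and start only the top session; the bottom session's instance variables stay nil until the first switch.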
