AVCaptureDeviceOutput not calling delegate method captureOutput

Problem Description

I am building an iOS application (my first) that processes video still frames on the fly. To dive into this, I followed an example from the AV* documentation from Apple.

The process involves setting up an input (the camera) and an output. The output works with a delegate, which in this case is the controller itself (it conforms and implements the method needed).

The problem I am having is that the delegate method never gets called. The code below is the implementation of the controller and it has a couple of NSLogs. I can see the "started" message, but the "delegate method called" never shows.

This code is all within a controller that implements the "AVCaptureVideoDataOutputSampleBufferDelegate" protocol.
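
For reference, the controller's declaration presumably looks something like the sketch below. The class name and property attributes are assumptions; videoDataOutputQueue and theImage are the names used in the code that follows.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
{
    // Serial queue on which the sample buffer delegate callbacks are delivered
    dispatch_queue_t videoDataOutputQueue;
}

// Image view used to display the most recent frame
@property (nonatomic, strong) UIImageView *theImage;

@end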

- (void)viewDidLoad {

    [super viewDidLoad];

    // Initialize AV session    
        AVCaptureSession *session = [AVCaptureSession new];

        if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
            [session setSessionPreset:AVCaptureSessionPreset640x480];
        else
            [session setSessionPreset:AVCaptureSessionPresetPhoto];

    // Initialize back camera input
        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        NSError *error = nil;

        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];

        if( [session canAddInput:input] ){
            [session addInput:input];
        }


    // Initialize image output
        AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];

        NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                           [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        [output setVideoSettings:rgbOutputSettings];
        [output setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked (as we process the still image)


        //[output addObserver:self forKeyPath:@"capturingStillImage" options:NSKeyValueObservingOptionNew context:@"AVCaptureStillImageIsCapturingStillImageContext"];

        videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        [output setSampleBufferDelegate:self queue:videoDataOutputQueue];


        if( [session canAddOutput:output] ){
            [session addOutput:output];
        }

        [[output connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];


    [session startRunning];

    NSLog(@"started");


}


- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

        NSLog(@"delegate method called");

        CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];

        self.theImage.image = [UIImage imageWithCGImage: cgImage ];

        CGImageRelease( cgImage );

}
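
(The question never shows imageFromSampleBuffer:. For completeness, a sketch of what such a helper usually looks like for 32BGRA buffers, along the lines of Apple's AVFoundation sample code; this is an assumption about the poster's implementation, not code from the question.)

- (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    // Lock the pixel buffer so its base address can be read directly
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and snapshot it as a CGImage
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef cgImage = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    return cgImage; // caller is responsible for CGImageRelease
}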

Note: I'm building with iOS 5.0 as a target.

Edit:

I've found a question that, although asking for a solution to a different problem, is doing exactly what my code is supposed to do. I've copied the code from that question verbatim into a blank Xcode app and added NSLogs to the captureOutput method, and it doesn't get called. Is this a configuration issue? Is there something I'm missing?

Recommended Answer

Your session is a local variable. Its scope is limited to viewDidLoad. Since this is a new project, I assume it's safe to say that you're using ARC. In that case the object won't leak, and therefore live on, as it would have done in the linked question; rather, the compiler will ensure it is deallocated before viewDidLoad exits.

Hence your session isn't running because it no longer exists.

(Aside: the self.theImage.image = ... is unsafe since it performs a UIKit action off the main queue; you probably want to dispatch_async that over to dispatch_get_main_queue().)

So, a sample correction:

@implementation YourViewController
{
     AVCaptureSession *session;
}

- (void)viewDidLoad {

    [super viewDidLoad];

    // Initialize AV session    
        session = [AVCaptureSession new];

        if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
            [session setSessionPreset:AVCaptureSessionPreset640x480];
        else
         /* ... etc ... */
}


- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

        NSLog(@"delegate method called");

        CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];

        dispatch_sync(dispatch_get_main_queue(),
        ^{
            self.theImage.image = [UIImage imageWithCGImage: cgImage ];
            CGImageRelease( cgImage );
         });
}

Most people advocate using an underscore at the beginning of instance variable names nowadays, but I omitted it for simplicity. You can use Xcode's built-in refactor tool to fix that up after you've verified that the diagnosis is correct.

I moved the CGImageRelease inside the block sent to the main queue to ensure its lifetime extends beyond its capture into a UIImage. I'm not immediately able to find any documentation to confirm that CoreFoundation objects have their lifetime automatically extended when captured in a block.
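
(A further aside, not from the original answer: if you'd rather not rely on that, you can build the UIImage on the capture queue and release the CGImageRef before dispatching, so the block only captures an ordinary Objective-C object. A minimal sketch:)

    // Alternative sketch (an assumption, not the answerer's code): convert to a
    // UIImage first, release the CGImageRef immediately, and only capture the
    // UIImage in the block dispatched to the main queue.
    CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);

    dispatch_async(dispatch_get_main_queue(), ^{
        self.theImage.image = image;
    });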
