captureOutput not being called


Problem description

I have been looking into this for way too long now.

I am trying to get MacOS webcam data and run CIDetect on the frames that the webcam outputs.

I know that I need to:

  • connect an AVCaptureDevice (as an input) to the AVCaptureSession

  • connect an AVCaptureVideoDataOutput (as an output) to the AVCaptureSession

  • call .setSampleBufferDelegate(AVCaptureVideoDataOutputSampleBufferDelegate, DelegateQueue) on that output
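The three steps above can be sketched roughly as follows. This is not the asker's exact code: it assumes the modern `AVCaptureDevice.default(for:)` API rather than the deprecated device enumeration used later in the question, and error handling is abbreviated.

```swift
import AVFoundation

let session = AVCaptureSession()
let delegate = MyDelegate()  // the delegate class defined below

// 1. Connect an AVCaptureDevice as an input.
if let device = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)
}

// 2. Connect an AVCaptureVideoDataOutput as an output.
let output = AVCaptureVideoDataOutput()
if session.canAddOutput(output) {
    session.addOutput(output)
}

// 3. Register the sample buffer delegate on a dedicated serial queue.
let queue = DispatchQueue(label: "VideoDataOutputQueue")
output.setSampleBufferDelegate(delegate, queue: queue)

session.startRunning()
```

Note that the delegate must be kept alive by a strong reference elsewhere; the output does not retain it.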

For some reason, after calling .setSampleBufferDelegate(...) (and of course after calling .startRunning() on the AVCaptureSession instance), my AVCaptureVideoDataOutputSampleBufferDelegate's captureOutput is not being called.

I found so many people having trouble with this online, but I was not able to find any solution.

It seems to me that this has something to do with the DispatchQueue.

MyDelegate.swift:

import AVFoundation
import CoreImage

class MyDelegate: NSObject {

    var context: CIContext?
    var detector: CIDetector?

    override init() {
        context = CIContext()
        detector = CIDetector(ofType: CIDetectorTypeFace, context: context)
        print("set up!")
    }
}
extension MyDelegate: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
        print("success?")
        let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let image: CIImage = CIImage(cvPixelBuffer: pixelBuffer)
        let features: [CIFeature] = detector!.features(in: image)
        for feature in features {
            print(feature.type)
            print(feature.bounds)
        }
    }

    func captureOutput(_: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
        print("fail?")
    }
}

ViewController.swift:

var captureSession : AVCaptureSession;
var captureDevice : AVCaptureDevice?
var previewLayer : AVCaptureVideoPreviewLayer?

var vdo : AVCaptureVideoDataOutput;

var videoDataOutputQueue : DispatchQueue;

override func viewDidLoad() {
    super.viewDidLoad()

    camera.layer = CALayer()

    // Do any additional setup after loading the view, typically from a nib.
    captureSession.sessionPreset = AVCaptureSessionPresetLow

    // Get all audio and video devices on this machine
    let devices = AVCaptureDevice.devices()

    // Find the FaceTime HD camera object
    for device in devices! {
        print(device)

        // Camera object found and assign it to captureDevice
        if ((device as AnyObject).hasMediaType(AVMediaTypeVideo)) {
            print(device)
            captureDevice = device as? AVCaptureDevice
        }
    }

    if captureDevice != nil {
        do {   
            try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice))
            // vdo : AVCaptureVideoDataOutput;
            vdo.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: NSNumber(value: kCVPixelFormatType_32BGRA)]

            try captureDevice!.lockForConfiguration()
            captureDevice!.activeVideoMinFrameDuration = CMTimeMake(1, 30)
            captureDevice!.unlockForConfiguration()

            videoDataOutputQueue.sync{
                vdo.setSampleBufferDelegate(
                    MyDelegate,
                    queue: videoDataOutputQueue
                );
                vdo.alwaysDiscardsLateVideoFrames = true
                captureSession.addOutput(vdo)   
                captureSession.startRunning();
            }
        } catch {
            print(AVCaptureSessionErrorKey.description)
        }
    }
}

All of the necessary variables inside viewDidLoad relating to AVFoundation have been instantiated inside the ViewController's init(). I've omitted that here for clarity.

Any ideas?

Thanks!

科维克

- Fixed setting delegate from self to MyDelegate.

And this is how I initialize videoDataOutputQueue:

    videoDataOutputQueue = DispatchQueue(
        label: "VideoDataOutputQueue"   
    );

Recommended answer

You made a mistake in the declaration of the required sample buffer delegate method:

captureOutput(_:didOutputSampleBuffer:from:).

Please check and make sure it is:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)

PS: Pay attention to how the parameters of that method are declared. All of them are marked with '!', which means they are implicitly unwrapped optionals.
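For reference, this is roughly how the question's extension would read with the corrected selector. It is a sketch, not verified against the asker's project: it replaces the `didOutput` version shown in the question (the conformance can only be declared once) and keeps the Swift 3 era implicitly unwrapped parameters the answer describes.

```swift
import AVFoundation
import CoreImage

extension MyDelegate: AVCaptureVideoDataOutputSampleBufferDelegate {

    // The selector must be didOutputSampleBuffer, not didOutput.
    // With the wrong name, AVFoundation finds no matching method and
    // simply never delivers frames; there is no error or warning.
    func captureOutput(_ captureOutput: AVCaptureOutput!,
                       didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                       from connection: AVCaptureConnection!) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return
        }
        let image = CIImage(cvPixelBuffer: pixelBuffer)
        for feature in detector!.features(in: image) {
            print(feature.type, feature.bounds)
        }
    }
}
```

This kind of silent failure is typical of Objective-C delegate protocols with optional methods: the runtime looks the method up by selector, so a mismatched name is not a compile error.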
