AVCaptureVideoDataOutputSampleBufferDelegate.captureOutput not called
Question
I currently have a self-developed framework (MySDK) and an iOS app (MyApp) that uses MySDK.
Inside MySDK, I have a class (Scanner) that processes images from the video output of the device camera.
Here is a sample of my code:
Scanner.swift
class Scanner: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    var captureDevice : AVCaptureDevice?
    var captureOutput : AVCaptureVideoDataOutput?
    var previewLayer : AVCaptureVideoPreviewLayer?
    var captureSession : AVCaptureSession?
    var rootViewController : UIViewController?

    func scanImage(viewController: UIViewController)
    {
        NSLog("%@", "scanning begins!")
        if (captureSession == nil) { captureSession = AVCaptureSession() }
        rootViewController = viewController
        captureSession!.sessionPreset = AVCaptureSessionPresetHigh
        let devices = AVCaptureDevice.devices()
        for device in devices {
            if (device.hasMediaType(AVMediaTypeVideo)) {
                if (device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                }
            }
        }
        if (captureDevice != nil) {
            NSLog("%@", "beginning session!")
            beginSession()
        }
    }

    func beginSession()
    {
        if (captureSession == nil) { captureSession = AVCaptureSession() }
        if (captureOutput == nil) { captureOutput = AVCaptureVideoDataOutput() }
        if (previewLayer == nil) { previewLayer = AVCaptureVideoPreviewLayer() }

        let queue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL)
        captureOutput!.setSampleBufferDelegate(self, queue: queue)
        captureOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as NSString: Int(kCVPixelFormatType_32BGRA)]

        captureSession!.addInput(try! AVCaptureDeviceInput(device: captureDevice))
        captureSession!.addOutput(captureOutput)

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer!.frame = rootViewController!.view.layer.frame
        rootViewController!.view.layer.addSublayer(previewLayer!)

        captureSession!.startRunning()
    }

    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!)
    {
        NSLog("%@", "captured!")
    }
}
Inside MyApp, I have a ViewController that implements an IBAction, in which the Scanner class is initialized and the scanImage function is triggered.
MyApp.m:
- (IBAction)btnScanImage_TouchDown:(id)sender
{
    Scanner * scanner = [[Scanner alloc] init];
    [scanner scanImage:self];
}
The camera view comes up inside the app, but the captureOutput function is never fired, and the console contains only these two lines:
2016-03-07 11:11:45.860 myapp[1236:337377] scanning begins!
2016-03-07 11:11:45.984 myapp[1236:337377] beginning session!
Creating a standalone app and embedding the code from Scanner.swift directly in a ViewController works just fine; the captureOutput function fires properly.
Does anyone have any idea what I am doing wrong here?
Answer
After much trial and error, I have finally found a solution to my problem.
Apparently, I was creating the Scanner object only as a local variable, not as a class-level (instance) variable.
Because the local scanner instance went out of scope as soon as the IBAction returned, ARC deallocated it, and with it the capture session's sample buffer delegate, before any frames could be delivered. Once the Scanner object was created as a class variable, the delegate method captureOutput was fired properly.
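For reference, a minimal sketch of the fix in MyApp's view controller (assuming ARC; the property name and class extension are illustrative, not from the original post). Holding the Scanner in a strong property keeps it, and therefore the sample buffer delegate, alive after the button handler returns:

```objc
// MyApp.m -- keep a strong reference so the Scanner (and its
// AVCaptureSession / sample buffer delegate) outlives the IBAction.
@interface ViewController ()
@property (nonatomic, strong) Scanner *scanner; // illustrative name
@end

@implementation ViewController

- (IBAction)btnScanImage_TouchDown:(id)sender
{
    // Previously a local variable: ARC released it when this method
    // returned, so captureOutput was never called.
    self.scanner = [[Scanner alloc] init];
    [self.scanner scanImage:self];
}

@end
```

The same applies in Swift: store the Scanner in a stored property of the view controller rather than a local constant inside the action method.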