Xcode 9/Swift 4 AVCaptureMetadataOutput setMetadataObjectTypes use availableMetadataObjectTypes

Problem description

There seem to be a lot of issues similar to what I am experiencing:

AVmetadata changes with swift 4 xcode 9

AVCaptureMetadataOutput setMetadataObjectTypes unsupported type found

And there is an Apple bug that deals with AVFoundation:

https://forums.developer.apple.com/thread/86810#259270

But none of those actually seem to be the answer for me.
I have code that runs great in Swift 3 but only errors out in Swift 4. Using the solutions in the above links results in no change at all.

Code:

import UIKit
import AVFoundation

class BarCodeScanViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {
weak var delegate: FlowControllerDelegate?

var captureSession: AVCaptureSession = AVCaptureSession()
var previewLayer: AVCaptureVideoPreviewLayer = AVCaptureVideoPreviewLayer()


override func viewDidLoad() {
    super.viewDidLoad()

    view.backgroundColor = UIColor.black
    captureSession = AVCaptureSession()

    guard let videoCaptureDevice = AVCaptureDevice.default(for: .video) else { return }
    let videoInput: AVCaptureDeviceInput

    do {
        videoInput = try AVCaptureDeviceInput(device: videoCaptureDevice)
    } catch {
        return
    }

    if (captureSession.canAddInput(videoInput)) {
        captureSession.addInput(videoInput)
    } else {
        failed()

        return
    }

    // let captureMetadataOutput = AVCaptureMetadataOutput()
    let metadataOutput = AVCaptureMetadataOutput()

    if captureSession.canAddOutput(metadataOutput) {
        captureSession.addOutput(metadataOutput)

        // Check status of camera permissions
        metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
//            metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.upce]
        metadataOutput.metadataObjectTypes = [.ean8, .ean13, .pdf417, .upce]
    } else {
        failed()

        return
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.layer.bounds
    previewLayer.videoGravity = .resizeAspectFill
    view.layer.addSublayer(previewLayer)

    captureSession.startRunning()
}

func failed() {
    let ac = UIAlertController(title: "Scanning not supported", message: "Your device does not support scanning a code from an item. Please use a device with a camera.", preferredStyle: .alert)
    ac.addAction(UIAlertAction(title: "OK", style: .default))

    present(ac, animated: true)

//        captureSession = nil
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    if(captureSession.isRunning == false) {
        captureSession.startRunning()
    }
}

override func viewWillDisappear(_ animated: Bool) {
    if captureSession.isRunning == true {
        captureSession.stopRunning()
    }

    super.viewWillDisappear(animated)
}

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!) {
    captureSession.stopRunning()

    if let metadataObject = metadataObjects.first {
        guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject else { return }
        guard let stringValue = readableObject.stringValue else { return }
        AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
        found(code: stringValue)
    }

    dismiss(animated: true)
}

func found(code: String) {
    print(code)
}

override var prefersStatusBarHidden: Bool {
    return true
}

override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    return .portrait
}
}

When I build this code in Xcode 8 with Swift 3 it works fine. When I run it in Xcode 9 with Swift 4, it crashes when adding the media types:

metadataOutput.metadataObjectTypes = [.ean8, .ean13, .pdf417, .upce]

In both cases I am building to an iOS 11 device that did not have the beta on it previously.

I have tried the "__" prefix to see if it was the Apple bug mentioned above. If I comment the line out, the code runs, but there is no capture.
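
For reference, one way to narrow this down is to log what the output actually reports as supported right after it is added to the session (a diagnostic sketch, not part of the code above):

let metadataOutput = AVCaptureMetadataOutput()

if captureSession.canAddOutput(metadataOutput) {
    captureSession.addOutput(metadataOutput)
    // Diagnostic only: availableMetadataObjectTypes should be populated once
    // the output belongs to a session that has a video input, so this shows
    // exactly which types can be assigned without throwing.
    print(metadataOutput.availableMetadataObjectTypes)
}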

Is there maybe some other bug Apple introduced? Is anyone else having this issue?

Any help would be greatly appreciated.

Thanks

Recommended answer

Some additional info for clarity:

Leevi Graham is correct, and it is also true that Apple changed the stack without proper documentation. This makes it seem like there is a bug.

Barcode on swift 4

has the clarifying comments that helped me.

The delegate callback has changed from:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!)

to:

func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection)
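
For example, the body of the old captureOutput(_:didOutputMetadataObjects:from:) method in the question carries over to the new signature essentially unchanged (a sketch based on that code, not a verbatim quote of the original answer):

func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
    captureSession.stopRunning()

    // Same handling as before: take the first machine-readable object,
    // pull out its string value, and hand it off.
    if let metadataObject = metadataObjects.first {
        guard let readableObject = metadataObject as? AVMetadataMachineReadableCodeObject,
              let stringValue = readableObject.stringValue else { return }
        AudioServicesPlaySystemSound(SystemSoundID(kSystemSoundID_Vibrate))
        found(code: stringValue)
    }

    dismiss(animated: true)
}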

HOWEVER, the real problem I was having was that you no longer set a long array of types for metadataObjectTypes. You now just set all available types:

metadataOutput.metadataObjectTypes = metadataOutput.availableMetadataObjectTypes
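
Putting that together with the setup in the question, the output configuration becomes (a sketch; if you only want specific symbologies, intersect your desired types with availableMetadataObjectTypes rather than hard-coding the array):

let metadataOutput = AVCaptureMetadataOutput()

if captureSession.canAddOutput(metadataOutput) {
    captureSession.addOutput(metadataOutput)

    metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
    // Ask AVFoundation what it supports instead of hard-coding a list,
    // which is what raised the "unsupported type" exception here.
    metadataOutput.metadataObjectTypes = metadataOutput.availableMetadataObjectTypes
} else {
    failed()
    return
}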

So...

This is, in fact, an API issue. Several radar issues were filed for it, but Apple has kindly changed their AVFoundation docs to address the issue.
