Show bounding box while detecting object using ARKit 2

Problem Description

I have scanned and trained multiple real-world objects. I do have the ARReferenceObjects, and the app detects them fine.

The issue that I'm facing is that when an object does not have distinct, vibrant features, it takes a few seconds to return a detection result, which I can understand. Now, I want the app to show a bounding box and an activity indicator on top of the object while it is trying to detect it.

I do not see any information regarding this. Also, is there any way to get the time when detection starts, or the confidence percentage of the object being detected?

Any help is appreciated.

Answer

It is possible to show a boundingBox for the ARReferenceObject prior to it being detected, although I am not sure why you would want to do that (in advance, anyway).

For example, assuming your referenceObject was on a horizontal surface, you would first need to place your estimated bounding box on the plane (or use some other method to place it in advance), and in the time it takes to detect the ARPlaneAnchor and place the boundingBox, your model would most likely already have been detected.
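
As a rough sketch of that plane-based idea (my own illustration, not from the original answer; the estimated size is an assumed value, and it requires the configuration to have been run with planeDetection = .horizontal), you could drop a placeholder box onto the first detected horizontal plane:

// Rough Sketch (Assumption): Place A Hypothetical Placeholder Box On The First Detected Horizontal Plane
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

    //1. Assumed Approximate Size Of The Object You Expect To Detect
    let estimatedExtent = SCNVector3(0.2, 0.2, 0.2)

    //2. Create A Semi-Transparent Placeholder Box
    let box = SCNBox(width: CGFloat(estimatedExtent.x),
                     height: CGFloat(estimatedExtent.y),
                     length: CGFloat(estimatedExtent.z),
                     chamferRadius: 0)
    box.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.3)

    //3. Sit It On Top Of The Plane At The Plane's Centre
    let placeholderNode = SCNNode(geometry: box)
    placeholderNode.position = SCNVector3(planeAnchor.center.x,
                                          estimatedExtent.y / 2,
                                          planeAnchor.center.z)
    node.addChildNode(placeholderNode)
}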

A possible approach:

As you are no doubt aware, an ARReferenceObject has center, extent and scale properties, as well as a set of rawFeaturePoints associated with the object.

As such, we can create our own boundingBox node based on some of the sample code from Apple in Scanning & Detecting 3D Objects: an SCNNode which displays a bounding box of the approximate size of the locally stored ARReferenceObject, prior to it being detected.

Note that you will need to locate the 'wireframe_shader' from the Apple Sample Code for the boundingBox to render as a transparent wireframe:

import Foundation
import ARKit
import SceneKit

class BlackMirrorzBoundingBox: SCNNode {

    //-----------------------
    // MARK: - Initialization
    //-----------------------

    /// Creates A WireFrame Bounding Box From The Data Retrieved From The ARReferenceObject
    ///
    /// - Parameters:
    ///   - points: [float3]
    ///   - scale: CGFloat
    ///   - color: UIColor
    init(points: [float3], scale: CGFloat, color: UIColor = .cyan) {
        super.init()

        var localMin = float3(Float.greatestFiniteMagnitude)
        var localMax = float3(-Float.greatestFiniteMagnitude)

        for point in points {
            localMin = min(localMin, point)
            localMax = max(localMax, point)
        }

        self.simdPosition += (localMax + localMin) / 2
        let extent = localMax - localMin

        let wireFrame = SCNNode()
        let box = SCNBox(width: CGFloat(extent.x), height: CGFloat(extent.y), length: CGFloat(extent.z), chamferRadius: 0)
        box.firstMaterial?.diffuse.contents = color
        box.firstMaterial?.isDoubleSided = true
        wireFrame.geometry = box
        setupShaderOnGeometry(box)
        self.addChildNode(wireFrame)
    }

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) Has Not Been Implemented") }

    //----------------
    // MARK: - Shaders
    //----------------

    /// Sets A Shader To Render The Cube As A Wireframe
    ///
    /// - Parameter geometry: SCNBox
    func setupShaderOnGeometry(_ geometry: SCNBox) {
        guard let path = Bundle.main.path(forResource: "wireframe_shader", ofType: "metal", inDirectory: "art.scnassets"),
            let shader = try? String(contentsOfFile: path, encoding: .utf8) else {

                return
        }

        geometry.firstMaterial?.shaderModifiers = [.surface: shader]
    }

}
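
If you can't locate that shader, a rough stand-in (my own suggestion, not from the Apple sample; requires iOS 11+) is SceneKit's fillMode, which renders the geometry as lines rather than filled faces; you could swap the body of setupShaderOnGeometry(_:) for something like:

/// Rough Alternative To The Metal Wireframe Shader (Assumption, Not From The Apple Sample):
/// Renders The Box As Lines Instead Of Filled Faces So It Reads As A Wireframe
func setupShaderOnGeometry(_ geometry: SCNBox) {
    geometry.firstMaterial?.fillMode = .lines
}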

To display the bounding box you would then do something like the following, noting that in my example I have the following variables:

 @IBOutlet var augmentedRealityView: ARSCNView!
 let configuration = ARWorldTrackingConfiguration()
 let augmentedRealitySession = ARSession()

To display the boundingBox prior to detection of the actual object itself, you would call the func loadBoundingBox in viewDidLoad, e.g.:

/// Creates A Bounding Box From The Data Available From The ARObject In The Local Bundle
func loadBoundingBox(){

    //1. Run Our Session
    augmentedRealityView.session = augmentedRealitySession
    augmentedRealityView.delegate = self

    //2. Load A Single ARReferenceObject From The Main Bundle
    if let objectURL =  Bundle.main.url(forResource: "fox", withExtension: ".arobject"){

        do{
            var referenceObjects = [ARReferenceObject]()
            let object = try ARReferenceObject(archiveURL: objectURL)

            //3. Log Its Properties
            print("""
                Object Center = \(object.center)
                Object Extent = \(object.extent)
                Object Scale = \(object.scale)
                """)

            //4. Get Its Scale
            let scale = CGFloat(object.scale.x)

            //5. Create A Bounding Box
            let boundingBoxNode = BlackMirrorzBoundingBox(points: object.rawFeaturePoints.points, scale: scale)

            //6. Add It To The ARSCNView
            self.augmentedRealityView.scene.rootNode.addChildNode(boundingBoxNode)

            //7. Position It 0.5m Away From The Camera
            boundingBoxNode.position = SCNVector3(0, -0.5, -0.5)

            //8. Add It To The Configuration
            referenceObjects.append(object)
            configuration.detectionObjects = Set(referenceObjects)

        }catch{
            print(error)
        }

    }

    //9. Run The Session
    augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    augmentedRealityView.automaticallyUpdatesLighting = true
}

The above example simply creates a boundingBox from the non-detected ARReferenceObject and places it 0.5m down from, and 0.5m away from, the Camera, which yields something like this:

You would of course need to handle the position of the boundingBox initially, as well as how to handle the removal of the boundingBox 'indicator'.
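
One way to handle that removal (my own sketch, not part of the original answer) is to name the placeholder node when you create it in loadBoundingBox(), e.g. boundingBoxNode.name = "placeholderBoundingBox", and then remove it at the top of the renderer(_:didAdd:for:) callback shown below:

// Sketch: Remove The Hypothetical Placeholder Box As Soon As The Real ARObjectAnchor Arrives
augmentedRealityView.scene.rootNode
    .childNode(withName: "placeholderBoundingBox", recursively: true)?
    .removeFromParentNode()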

The method below simply shows a boundingBox when the actual object is detected, e.g.:

//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor 
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }

        //2. Create A Bounding Box Around Our Object
        let scale = CGFloat(objectAnchor.referenceObject.scale.x)
        let boundingBoxNode = BlackMirrorzBoundingBox(points: objectAnchor.referenceObject.rawFeaturePoints.points, scale: scale)
        node.addChildNode(boundingBoxNode)

    }

}

Which yields something like this:

In regard to the detection timer, there is an example in the Apple Sample Code, which displays how long it takes to detect the model.

In its crudest form (not accounting for milliseconds) you can do something like this:

First, create a Timer and a var to store the detection time, e.g.:

var detectionTimer = Timer()

var detectionTime: Int = 0

Then, when you run your ARSession configuration, initialise the timer, e.g.:

/// Starts The Detection Timer
func startDetectionTimer(){

     detectionTimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(logDetectionTime), userInfo: nil, repeats: true)
}

/// Increments The Total Detection Time Before The ARReference Object Is Detected
@objc func logDetectionTime(){
    detectionTime += 1

}
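
For completeness (my own sketch), you would kick the timer off at the same point you run the session, e.g. at the end of loadBoundingBox():

// Run The Session & Start Timing How Long Detection Takes
augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
startDetectionTimer()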

Then, when an ARReferenceObject has been detected, invalidate the timer and log the time, e.g.:

//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor
        guard let _ = anchor as? ARObjectAnchor else { return }

        //2. Stop The Timer
        detectionTimer.invalidate()

        //3. Log The Detection Time
        print("Total Detection Time = \(detectionTime) Seconds")

        //4. Reset The Detection Time
        detectionTime = 0

    }

}

This should be more than enough to get you started...

Please also note that this example does not provide a boundingBox while scanning an object (see the Apple Sample Code for that); rather, it provides one based on an existing ARReferenceObject, as implied in the question (assuming I have interpreted it correctly).
