Show bounding box while detecting object using ARKit 2


Problem Description



I have scanned and trained multiple real world objects. I do have the ARReferenceObject and the app detects them fine.

The issue that I'm facing is that when an object does not have distinct, vibrant features, it takes a few seconds to return a detection result, which I can understand. Now, I want the app to show a bounding box and an activity indicator on top of the object while it is trying to detect it.

I do not see any information regarding this. Also, is there any way to get the time when detection starts, or the confidence percentage of the object being detected?

Any help is appreciated.

Solution

It is possible to show a boundingBox for the ARReferenceObject prior to it being detected, although I am not sure why you would want to do that (in advance, anyway).

For example, assuming your referenceObject is on a horizontal surface, you would first need to place your estimated bounding box on the plane (or use some other method to place it in advance). In the time it takes to detect the ARPlaneAnchor and place the boundingBox, your model would most likely already have been detected.
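Placing such an estimated box on a detected plane might look something like the sketch below (names are illustrative, and you would call it from `renderer(_:didAdd:for:)` when the anchor is an `ARPlaneAnchor`):

```swift
import ARKit
import SceneKit

/// Places An Estimated, Semi-Transparent Box On A Detected Plane,
/// Sized From The Locally Stored ARReferenceObject (Sketch Only)
func placeEstimatedBox(for object: ARReferenceObject,
                       on planeAnchor: ARPlaneAnchor,
                       attachedTo node: SCNNode) -> SCNNode {

    //1. Build A Box Matching The Stored Object's Extent
    let extent = object.extent
    let box = SCNBox(width: CGFloat(extent.x),
                     height: CGFloat(extent.y),
                     length: CGFloat(extent.z),
                     chamferRadius: 0)
    box.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.3)
    box.firstMaterial?.isDoubleSided = true

    //2. Centre It On The Plane, Resting On The Surface
    let boxNode = SCNNode(geometry: box)
    boxNode.simdPosition = planeAnchor.center + float3(0, extent.y / 2, 0)

    //3. Attach It To The Plane's Node & Return It So It Can Be Removed Later
    node.addChildNode(boxNode)
    return boxNode
}
```

Keeping a reference to the returned node lets you remove the indicator once the real object has been detected.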

Possible Approach:

As you are no doubt aware an ARReferenceObject has a center, extent and scale property as well as a set of rawFeaturePoints associated with the object.

As such, we can create our own boundingBox node based on some of the sample code from Apple in Scanning & Detecting 3D Objects: an SCNNode which displays a bounding box of the approximate size of the ARReferenceObject that is stored locally, prior to it being detected.

Note that you will need to locate the 'wireframe_shader' from the Apple Sample Code for the boundingBox to render as transparent:

import Foundation
import ARKit
import SceneKit

class BlackMirrorzBoundingBox: SCNNode {

    //-----------------------
    // MARK: - Initialization
    //-----------------------

    /// Creates A WireFrame Bounding Box From The Data Retrieved From The ARReferenceObject
    ///
    /// - Parameters:
    ///   - points: [float3]
    ///   - scale: CGFloat
    ///   - color: UIColor
    init(points: [float3], scale: CGFloat, color: UIColor = .cyan) {
        super.init()

        var localMin = float3(Float.greatestFiniteMagnitude)
        var localMax = float3(-Float.greatestFiniteMagnitude)

        for point in points {
            localMin = min(localMin, point)
            localMax = max(localMax, point)
        }

        self.simdPosition += (localMax + localMin) / 2
        let extent = localMax - localMin

        let wireFrame = SCNNode()
        let box = SCNBox(width: CGFloat(extent.x), height: CGFloat(extent.y), length: CGFloat(extent.z), chamferRadius: 0)
        box.firstMaterial?.diffuse.contents = color
        box.firstMaterial?.isDoubleSided = true
        wireFrame.geometry = box
        setupShaderOnGeometry(box)
        self.addChildNode(wireFrame)
    }

    required init?(coder aDecoder: NSCoder) { fatalError("init(coder:) Has Not Been Implemented") }

    //----------------
    // MARK: - Shaders
    //----------------

    /// Sets A Shader To Render The Cube As A Wireframe
    ///
    /// - Parameter geometry: SCNBox
    func setupShaderOnGeometry(_ geometry: SCNBox) {
        guard let path = Bundle.main.path(forResource: "wireframe_shader", ofType: "metal", inDirectory: "art.scnassets"),
            let shader = try? String(contentsOfFile: path, encoding: .utf8) else {

                return
        }

        geometry.firstMaterial?.shaderModifiers = [.surface: shader]
    }

}

To display the bounding box you would then do something like the following, noting that in my example I have the following variables:

 @IBOutlet var augmentedRealityView: ARSCNView!
 let configuration = ARWorldTrackingConfiguration()
 let augmentedRealitySession = ARSession()

To display the boundingBox prior to detection of the actual object itself, you would call the func loadBoundingBox in viewDidLoad e.g:

/// Creates A Bounding Box From The Data Available From The ARObject In The Local Bundle
func loadBoundingBox(){

    //1. Run Our Session
    augmentedRealityView.session = augmentedRealitySession
    augmentedRealityView.delegate = self

    //2. Load A Single ARReferenceObject From The Main Bundle
    if let objectURL = Bundle.main.url(forResource: "fox", withExtension: "arobject"){

        do{
            var referenceObjects = [ARReferenceObject]()
            let object = try ARReferenceObject(archiveURL: objectURL)

            //3. Log Its Properties
            print("""
                Object Center = \(object.center)
                Object Extent = \(object.extent)
                Object Scale = \(object.scale)
                """)

            //4. Get Its Scale
            let scale = CGFloat(object.scale.x)

            //5. Create A Bounding Box
            let boundingBoxNode = BlackMirrorzBoundingBox(points: object.rawFeaturePoints.points, scale: scale)

            //6. Add It To The ARSCNView
            self.augmentedRealityView.scene.rootNode.addChildNode(boundingBoxNode)

            //7. Position It 0.5m Away From The Camera
            boundingBoxNode.position = SCNVector3(0, -0.5, -0.5)

            //8. Add It To The Configuration
            referenceObjects.append(object)
            configuration.detectionObjects = Set(referenceObjects)

        }catch{
            print(error)
        }

    }

    //9. Run The Session
    augmentedRealitySession.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    augmentedRealityView.automaticallyUpdatesLighting = true
}

The above example simply creates a boundingBox from the non-detected ARReferenceObject and places it 0.5 m down from and 0.5 m away from the camera, which yields something like this:

You would of course need to handle the position of the boundingBox initially, as well as how to handle the removal of the boundingBox 'indicator'.
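One way to handle that removal (a sketch, assuming a hypothetical `boundingBoxPlaceholder` property on your view controller that holds the pre-detection indicator node):

```swift
import ARKit
import SceneKit

// Sketch only: `boundingBoxPlaceholder` is a hypothetical property
// holding the indicator node that was placed before detection.
var boundingBoxPlaceholder: SCNNode?

/// Removes The Placeholder Indicator Once The Real Object Has Been Detected
func removePlaceholderIfNeeded(for anchor: ARAnchor) {

    //1. Only Act When An Actual ARObjectAnchor Has Been Detected
    guard anchor is ARObjectAnchor else { return }

    //2. Fade The Placeholder Out & Remove It From The Scene
    boundingBoxPlaceholder?.runAction(
        SCNAction.sequence([SCNAction.fadeOut(duration: 0.25),
                            SCNAction.removeFromParentNode()])
    )
    boundingBoxPlaceholder = nil
}
```

You could call this at the top of `renderer(_:didAdd:for:)`, so the estimated box disappears the moment the real one is attached.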

The method below simply shows a boundingBox when the actual object is detected, e.g:

//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor 
        guard let objectAnchor = anchor as? ARObjectAnchor else { return }

        //2. Create A Bounding Box Around Our Object
        let scale = CGFloat(objectAnchor.referenceObject.scale.x)
        let boundingBoxNode = BlackMirrorzBoundingBox(points: objectAnchor.referenceObject.rawFeaturePoints.points, scale: scale)
        node.addChildNode(boundingBoxNode)

    }

}

Which yields something like this:

In regard to the detection timer, there is an example in the Apple Sample Code, which displays how long it takes to detect the model.

In its crudest form (not accounting for milliseconds) you can do something like so:

Firstly, create a Timer and a var to store the detection time, e.g:

var detectionTimer = Timer()

var detectionTime: Int = 0

Then when you run your ARSessionConfiguration initialise the timer e.g:

/// Starts The Detection Timer
func startDetectionTimer(){

     detectionTimer = Timer.scheduledTimer(timeInterval: 1.0, target: self, selector: #selector(logDetectionTime), userInfo: nil, repeats: true)
}

/// Increments The Total Detection Time Before The ARReference Object Is Detected
@objc func logDetectionTime(){
    detectionTime += 1

}

Then when an ARReferenceObject has been detected invalidate the timer and log the time e.g:

//--------------------------
// MARK: - ARSCNViewDelegate
//--------------------------

extension ViewController: ARSCNViewDelegate{

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {

        //1. Check We Have A Valid ARObject Anchor
        guard let _ = anchor as? ARObjectAnchor else { return }

        //2. Stop The Timer
        detectionTimer.invalidate()

        //3. Log The Detection Time
        print("Total Detection Time = \(detectionTime) Seconds")

        //4. Reset The Detection Time
        detectionTime = 0

    }

}

This should be more than enough to get you started...

And please note that this example doesn't provide a boundingBox when scanning an object (look at the Apple Sample Code for that); it provides one based on an existing ARReferenceObject, which is implied in your question (assuming I interpreted it correctly).
