Are there any limitations in Vuforia compared to ARCore and ARKit?


Question


    I am a beginner in the field of augmented reality, working on applications that create plans of buildings (floor plan, room plan, etc with accurate measurements) using a smartphone. So I am researching about the best AR SDK which can be used for this. There are not many articles pitting Vuforia against ARCore and ARKit.

    Please suggest the best SDK to use, pros and cons of each.

    Solution

    Updated: September 24, 2021.

    TL;DR

    Google ARCore allows you to build apps for Android and iOS. With Apple ARKit you can build apps for iOS; with Apple RealityKit – for iOS and macOS. And the great old PTC Vuforia was designed to create apps for Android, iOS and the Universal Windows Platform.

    A crucial peculiarity of Vuforia is that it uses ARCore/ARKit technology if the hardware it's running on supports it; otherwise, Vuforia uses its own AR technology and engine, known as a software solution without dependent hardware.

    When developing for Android OEM smartphones, you may encounter an unpleasant issue: devices from different manufacturers need sensor calibration in order to deliver the same AR experience. Luckily, Apple gadgets have no such drawback because all the sensors used there were calibrated under identical conditions.

    Let me put first things first.



    Google ARCore 1.27

    ARCore was released in March 2018. ARCore is based on three fundamental concepts: Motion Tracking, Environmental Understanding and Light Estimation. ARCore allows a supported mobile device to track its position and orientation relative to the world in six degrees of freedom (6DoF) using a special technique called Concurrent Odometry and Mapping. COM helps us detect the size and location of horizontal, vertical and angled tracked surfaces. Motion Tracking works robustly thanks to optical data coming from an RGB camera at 60 fps, combined with inertial data coming from the gyroscope and accelerometer at 1000 fps, and depth data coming from a ToF sensor at 60 fps. Surely, ARKit, Vuforia and other AR libraries operate almost the same way.
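
The fusion idea described above, i.e. dead-reckoning with high-rate inertial data while periodically correcting drift with lower-rate visual data, can be sketched with a toy one-axis complementary filter. This is an illustrative simplification, not ARCore's actual COM implementation; the function name and rates are only assumptions for the sketch.

```python
# Toy sketch of visual-inertial fusion (NOT ARCore's actual algorithm):
# integrate gyro angular velocity at 1000 Hz, and every `cam_every` IMU
# steps blend in an absolute heading estimated from the camera image.

def fuse_heading(imu_rates, camera_headings, imu_dt=0.001, cam_every=16, alpha=0.98):
    """imu_rates: angular-velocity samples (rad/s) at the IMU rate.
    camera_headings: absolute heading fixes (rad), one per `cam_every` IMU steps.
    Returns the fused heading after processing all samples."""
    heading = camera_headings[0]          # initialize from the first visual fix
    cam_idx = 0
    for i, rate in enumerate(imu_rates):
        heading += rate * imu_dt          # dead-reckon with the gyro
        if (i + 1) % cam_every == 0 and cam_idx + 1 < len(camera_headings):
            cam_idx += 1
            # trust the gyro short-term, the camera long-term
            heading = alpha * heading + (1 - alpha) * camera_headings[cam_idx]
    return heading
```

The blending constant `alpha` controls how quickly visual fixes pull the integrated gyro estimate back when it drifts.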

    

When you move your phone through the real environment, ARCore tracks the surrounding space to understand where the smartphone is relative to the world coordinates. At the tracking stage, ARCore "sows" so-called feature points. These feature points are visible through the RGB camera, and ARCore uses them to compute the phone's change in location. The visual data is then combined with measurements from the IMU (Inertial Measurement Unit) to estimate the position and orientation of the ArCamera over time. If a phone isn't equipped with a ToF sensor, ARCore looks for clusters of feature points that appear to lie on horizontal, vertical or angled surfaces and makes these surfaces available to your app as planes (we call this technique Plane Detection). After the detection process you can use these planes to place 3D objects in your scene. Virtual geometry with assigned shaders will be rendered by ARCore's companion Sceneform, which supports Filament, a real-time Physically Based Rendering (a.k.a. PBR) engine.
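
The horizontal/vertical/angled classification mentioned above boils down to simple geometry. Here is a hypothetical, heavily simplified sketch: compute a plane normal from three feature points and label the candidate surface by how vertical the normal is (assuming a y-up world space, as ARCore uses; the helper names and the tolerance are my own).

```python
# Simplified sketch of surface classification from feature points
# (illustrative only, not ARCore's implementation).

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def plane_normal(p0, p1, p2):
    """Unit normal of the plane through three non-collinear feature points."""
    u = tuple(b - a for a, b in zip(p0, p1))
    v = tuple(b - a for a, b in zip(p0, p2))
    n = cross(u, v)
    length = sum(c * c for c in n) ** 0.5
    return tuple(c / length for c in n)

def classify(normal, tol=0.1):
    up = abs(normal[1])            # y is "up" in ARCore's world space
    if up > 1 - tol:
        return "horizontal"        # floor or tabletop
    if up < tol:
        return "vertical"          # wall
    return "angled"
```

A real tracker fits a least-squares plane to a whole cluster of points rather than three, but the resulting normal is classified the same way.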

    Notwithstanding the above, the Sceneform repository has now been archived and is no longer actively maintained by Google. The last released version was Sceneform 1.17.1. That may sound strange, but an ARCore team member said "there's no direct replacement for the Sceneform library, and ARCore developers are free to use any 3D game library with Android AR apps" (video from Google I/O'21, time 06:20).

    
ARCore's environmental understanding lets you place 3D objects with a correct depth occlusion in a way that realistically integrates with the real world. For example, you can place a virtual cup of coffee on the table using Depth hit-testing and ArAnchors.
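
Placing that virtual cup of coffee comes down to a hit-test: intersect the camera ray through the tapped pixel with a detected plane to get the world-space anchor point. A minimal geometric sketch of that intersection (a generic ray-plane test, not ARCore's API):

```python
# Minimal sketch of the geometry behind a plane hit-test: intersect a camera
# ray with a detected plane; the intersection point is where an anchor (and
# the virtual object) can be placed.

def hit_test(ray_origin, ray_dir, plane_point, plane_normal, eps=1e-9):
    """All arguments are 3-tuples. Returns the intersection point, or None
    if the ray is parallel to the plane or the plane is behind the camera."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(ray_dir, plane_normal)
    if abs(denom) < eps:
        return None                       # ray parallel to the plane
    t = dot([p - o for p, o in zip(plane_point, ray_origin)], plane_normal) / denom
    if t < 0:
        return None                       # plane is behind the camera
    return tuple(o + t * d for o, d in zip(ray_origin, ray_dir))
```

In the real APIs the frameworks perform this against every tracked plane (and, with depth data, against arbitrary geometry) and return the hits sorted by distance.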

    
ARCore can also define lighting parameters of a real environment and provide you with the average intensity and color correction of a given camera image. This data lets you light your virtual scene under the same conditions as the environment around you, considerably increasing the sense of realism.
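
What "average intensity and color correction" means can be sketched in a few lines: average the camera frame's channels to get a scene brightness, and derive per-channel gains that would neutralize its color cast. This is a hypothetical illustration of the concept, not the actual Lighting Estimation API.

```python
# Conceptual sketch of a light-estimation stage (not ARCore's API): mean
# pixel intensity of a frame, plus per-channel gains that balance the
# frame's color cast toward neutral gray.

def estimate_lighting(frame):
    """frame: list of (r, g, b) pixels with components in 0..1.
    Returns (average intensity, (r_gain, g_gain, b_gain))."""
    n = len(frame)
    means = [sum(px[c] for px in frame) / n for c in range(3)]
    intensity = sum(means) / 3
    # gains that pull each channel's mean back to the gray average
    gains = tuple(intensity / m if m > 0 else 1.0 for m in means)
    return intensity, gains
```

Applying the returned intensity and gains to your virtual lights and materials is what makes rendered objects match the ambient lighting of the room.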



    The current ARCore version has such significant APIs as the Raw Depth API and Full Depth API, Lighting Estimation, Augmented Faces, Augmented Images, Instant Placement, Debugging Tools, 365-day Cloud Anchors, Recording and Playback, and Multiplayer support. The main advantage of ARCore in Android Studio over ARKit in Xcode is the Android Emulator, which allows you to run and debug AR apps using a virtual device.



    This table presents the difference between Raw Depth API and Full Depth API:

    |------------|--------------------|--------------------|
    |            |  "Raw Depth API"   |  "Full Depth API"  |
    |------------|--------------------|--------------------|
    |  Accuracy  |       Awesome      |         Bad        |
    |------------|--------------------|--------------------|
    |  Coverage  |   Not all pixels   |     All pixels     |
    |------------|--------------------|--------------------|
    |  Distance  |    0.5 to 5.0 m    |     0 to 8.0 m     |
    |------------|--------------------|--------------------|
    

    ARCore is older than ARKit. Do you remember Project Tango, released in 2014? Roughly speaking, ARCore is just a rewritten Tango SDK. But the wise acquisitions of FlyBy Media, Faceshift, MetaIO, Camerai and Vrvana helped Apple not only catch up but significantly overtake Google. I suppose it's good for the AR industry.

    The latest version of ARCore supports OpenGL ES acceleration, and integrates with Unity, Unreal, and Web applications. At the moment the most powerful and energy efficient chipsets for AR experience on Android platform are Snapdragon 888 Plus (5nm), Exynos 2100 (5nm) and Kirin 9000 (5nm) – now Google and Huawei are almost friends again.

    ARCore price: FREE.

    |------------------------------|------------------------------|
    |        "ARCore PROs"         |        "ARCore CONs"         | 
    |------------------------------|------------------------------|
    | iToF and Depth API support   | No Body Tracking support     |
    |------------------------------|------------------------------|
    | Quick Plane Detection        | Cloud Anchors hosted online  |
    |------------------------------|------------------------------|
    | Long-distance-accuracy       | Lack of rendering engines    |
    |------------------------------|------------------------------|
    | ARCore Emulator in AS        | Poor developer documentation | 
    |------------------------------|------------------------------|
    | High-quality Lighting API    | No external camera support   |
    |------------------------------|------------------------------|
    | A lot of supported devices   | Poor Google Glass API        |
    |------------------------------|------------------------------|
    

    Here's an ARCore code snippet written in Kotlin:

    private fun addNodeToScene(fragment: ArFragment, 
                                 anchor: Anchor, 
                             renderable: Renderable) {
        
        val anchorNode = AnchorNode(anchor)
        anchorNode.setParent(fragment.arSceneView.scene)
        
        val modelNode = TransformableNode(fragment.transformationSystem)
        modelNode.setParent(anchorNode)
        modelNode.setRenderable(renderable)
        modelNode.localPosition = Vector3(0.0f, 0.0f, -3.0f)
        fragment.arSceneView.scene.addChild(anchorNode)
        
        modelNode.select()
    }
    




    Apple ARKit 5.0

    ARKit was released in June 2017. Like its competitors, ARKit uses a special technique for tracking, but its name is Visual Inertial Odometry. VIO is used to track the world around your device very accurately. VIO is quite similar to COM found in ARCore. There are also three similar fundamental concepts in ARKit: World Tracking, Scene Understanding (which includes four stages: Plane Detection, Ray-Casting, Light Estimation, Scene Reconstruction), and Rendering, with the great help of ARKit's companions: the SceneKit framework, which has actually been Apple's 3D game engine since 2012; the RealityKit framework, specially made for AR and written in Swift from scratch (released in 2019); and the SpriteKit framework with its 2D engine (since 2013).

    VIO fuses RGB sensor data at 60 fps with Core Motion (IMU) data at 1000 fps and LiDAR data. In addition, it should be noted that due to a very high energy impact (an enormous burden on the CPU and GPU), your iPhone's battery will be drained pretty quickly. The same can be said about Android devices.

    ARKit has a handful of useful approaches for robust tracking and accurate measurements. Among its arsenal you can find easy-to-use functionality for saving and retrieving ARWorldMaps. A world map is an indispensable "portal" for Persistent and Multiuser AR experiences that allows you to come back to the same environment, filled with the same chosen 3D content, from just before the moment your app became inactive. Support for simultaneous front and back camera capture, and for collaborative sessions, is also great.
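
The persistence idea behind a world map can be sketched platform-agnostically: when the session ends, serialize the anchors (poses plus names) to disk; on relaunch, load them back so the content reappears in the same places. The function names and the JSON format below are hypothetical, purely to illustrate the concept; a real ARWorldMap also stores the feature-point data needed to relocalize.

```python
import json

# Toy sketch of AR persistence (not the ARWorldMap format): save anchor
# records when the app goes inactive, restore them on the next launch.

def save_world_map(path, anchors):
    """anchors: list of dicts like {"name": ..., "position": [x, y, z]}."""
    with open(path, "w") as f:
        json.dump({"anchors": anchors}, f)

def load_world_map(path):
    with open(path) as f:
        return json.load(f)["anchors"]
```

In a multiuser session, the same serialized map is what gets shipped to the other devices so everyone shares one coordinate space.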

    There is good news for gamers: up to six people can simultaneously play the same AR game, thanks to the MultipeerConnectivity framework. For 3D geometry you could use the brand-new USDZ file format, developed and supported by Pixar. USDZ is a good choice for sophisticated 3D models with multiple PBR shaders, physics, animations and spatial sound. You can also use several other 3D formats with ARKit.

    ARKit can also help you perform the People and Objects Occlusion technique (based on alpha and depth channel segmentation), LiDAR Scene Reconstruction, Body Motion Capture tracking, Vertical and Horizontal Plane detection, Image detection, 3D Object detection and 3D Object scanning. With the People and Objects Occlusion tool, your AR content realistically passes behind and in front of real-world entities, making AR experiences even more immersive. Realistic reflections that use machine learning algorithms, and a Face tracking experience allowing you to track up to three faces at a time, are also available to you.
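
The depth half of that occlusion technique reduces to a per-pixel comparison: the rendered virtual pixel only wins when it is closer to the camera than the real-world depth sampled at the same pixel. A toy sketch (illustrative only, not Apple's implementation):

```python
# Per-pixel sketch of depth-based occlusion compositing: virtual content is
# drawn only where its depth beats the real-world depth at that pixel.

def composite(real_rgb, real_depth, virt_rgb, virt_depth):
    """Each *_rgb is a flat list of pixel values, each *_depth a flat list of
    distances in meters (None in virt_depth = no virtual content there).
    Returns the composited pixel list."""
    out = []
    for rc, rd, vc, vd in zip(real_rgb, real_depth, virt_rgb, virt_depth):
        if vd is not None and vd < rd:
            out.append(vc)   # virtual content passes in front of the world
        else:
            out.append(rc)   # a real surface (e.g. a hand) occludes it
    return out
```

People Occlusion additionally uses a person-segmentation alpha matte so that people occlude content even on devices without a depth sensor.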



    Using ARKit and iBeacons, you can assist an iBeacon-aware application in knowing what room it's in and showing the right 3D content chosen for that room. Working with ARKit, you should intensively exploit the ARAnchor class and all its subclasses.


    Pay particular attention to RealityKit's satellite, the Reality Composer app, which is now a part of Xcode. This brand-new app helps you prototype AR scenes. Scenes built in Reality Composer can be packed with dynamics, simple animations and PBR shaders. Reality Composer can also be installed on iOS and iPadOS as a standalone app.

    For creating ARKit 5.0 apps you need macOS Monterey, Xcode 13 and a device running iOS 15. ARKit is a worthy candidate to marry the Metal framework for GPU acceleration. Don't forget that ARKit tightly integrates with Unity and Unreal. At the moment the most powerful and energy-efficient chipsets for AR experiences are Apple M1 (5nm) and A15 Bionic (5nm).

    ARKit price: FREE.

    |------------------------------|------------------------------|
    |         "ARKit PROs"         |         "ARKit CONs"         | 
    |------------------------------|------------------------------|
    | LiDAR and Depth API support  | No AR glasses support        |
    |------------------------------|------------------------------|
    | Stable 6 DoF World Tracking  | No auto-update for Anchors   |
    |------------------------------|------------------------------|
    | Collaborative Sessions       | iOS / Chipsets Restrictions  |
    |------------------------------|------------------------------|
    | WorldMaps, iBeacon-awareness | No ARKit Simulator in Xcode  |
    |------------------------------|------------------------------|
    | 4 rendering technologies     | No external camera support   |
    |------------------------------|------------------------------|
    | Rich developer documentation | Quickly drains your battery  |
    |------------------------------|------------------------------|
    

    Here's an ARKit code snippet written in Swift:

    func renderer(_ renderer: SCNSceneRenderer, 
                 didAdd node: SCNNode, 
                  for anchor: ARAnchor) {
        
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        let planeNode = tableTop(planeAnchor)
        node.addChildNode(planeNode)
    }
        
    func tableTop(_ anchor: ARPlaneAnchor) -> SCNNode {
        
        let x = CGFloat(anchor.extent.x)
        let z = CGFloat(anchor.extent.z)
        
        let tableNode = SCNNode()
        tableNode.geometry = SCNPlane(width: x, height: z)
        tableNode.position = SCNVector3(anchor.center.x, 0, anchor.center.z)
        return tableNode
    }
    




    Apple RealityKit 2.0

    Pay particular attention to RealityKit, which was introduced at WWDC 2019. There's been a lot of hype around it since then. RealityKit supports the Entity-Component-System paradigm and allows you to create AR/VR experiences for iOS/macOS. This high-level framework works with .usdz assets and with .rcproject and .reality files, which you can import from the standalone macOS/iOS app Reality Composer (RC). Cupertino software engineers built RealityKit from the ground up for augmented reality apps that you can create without repetitive code. It works with Swift from scratch; there's no Objective-C legacy. And, of course, RealityKit shines not only with SwiftUI and UIKit but with Metal too.

    The RealityKit framework is built on two fundamental blocks: a ModelEntity class (that depends on MeshResource and Materials) and an AnchorEntity class (that automatically tracks its target, unlike ARAnchor in ARKit).
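
The Entity-Component-System paradigm those classes follow can be sketched in a few lines of language-agnostic pseudocode-style Python (this is an illustration of the pattern, not the RealityKit API): an entity is an identity plus a bag of components, and systems select entities by the components they carry.

```python
# Tiny Entity-Component sketch of the paradigm RealityKit follows
# (illustrative only; RealityKit's real types are Entity, ModelComponent, etc.)

class Entity:
    def __init__(self, name):
        self.name = name
        self.components = {}     # component kind -> component data

    def set(self, kind, data):
        self.components[kind] = data

def entities_with(entities, kind):
    """A 'system' processes every entity carrying a given component."""
    return [e for e in entities if kind in e.components]
```

In RealityKit terms, a ModelEntity is simply an entity pre-equipped with the components needed for rendering (mesh, materials, collision), while an AnchorEntity carries the component that ties its subtree to a tracked real-world target.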

    RealityKit gives you a rich set of tools to work with AR/VR: a new declarative Swift syntax, 3D primitives, PBR materials with textures, occlusion and video materials, lighting fixtures with realistic ray-traced shadows, spatial audio processing, 10 different anchor types, a simplified setup for collaborative sessions, robust animation and physics setup, indispensable built-in AI and ML features and many other useful things.

    There is a generally accepted opinion that all AR frameworks are much better and faster in defining horizontal surfaces as opposed to vertical ones. RealityKit, like all modules considered here, is not an exception to this rule.

    Reality Composer's simple and intuitive UI is good for prototyping AR scenes. RC has a royalty-free library of downloadable 3D assets that allow you to construct sophisticated 3D scenes with animation, audio, and dynamics, including a thorough description of how these objects were built and behave. You can also export your composition as a lightweight AR Quick Look experience that lets users place and preview content. In Reality Composer you can start a project using one of five anchor types: horizontal, vertical, image, face and object, corresponding to the desired type of tracking.

    RealityKit and Reality Composer price: FREE.

    |------------------------------|------------------------------|
    |       "RealityKit PROs"      |      "RealityKit CONs"       | 
    |------------------------------|------------------------------|
    | Can create AR apps w/o ARKit | Intensive usage of CPU/GPU   |
    |------------------------------|------------------------------|
    | Very little boilerplate code | iOS 13+, macOS 10.15+ only   |
    |------------------------------|------------------------------|
    | Suitable for AR/VR projects  | Start lagging on old devices |
    |------------------------------|------------------------------|
    | Robust API for RC scenes     | There's no particle system   |
    |------------------------------|------------------------------|
    | Asynchronous asset loading   | Lack of Apple documentation  |
    |------------------------------|------------------------------|
    | Autoupdating tracking target | No AR glasses support        |
    |------------------------------|------------------------------|
    

    Here's a RealityKit code snippet written in Swift:

    override func viewDidLoad() {
        super.viewDidLoad()
        
        // Load a scene made in Reality Composer and dig out its model component
        let textAnchor = try! SomeText.loadTextScene()
        let textEntity: Entity = textAnchor.realityComposer!.children[0]
        var textMC: ModelComponent = textEntity.children[0].components[ModelComponent.self]!
        
        // Swap in a yellow material and a new text mesh, then reattach
        var material = SimpleMaterial()
        material.baseColor = .color(.yellow)
        textMC.materials[0] = material
        textMC.mesh = .generateText("Hello, RealityKit")
        textAnchor.realityComposer!.children[0].children[0].components.set(textMC)
        arView.scene.anchors.append(textAnchor)
    }
    

    One more important part of Apple's AR ecosystem is the Reality Converter app. Now, instead of using a command-line conversion tool, you can use Reality Converter. The brand-new app makes it easy for you to convert, view and customize .usdz 3D objects on a Mac. Simply drag and drop common 3D file formats, such as .obj, .gltf or .usd, to view the converted .usdz result, then customize material properties with your own textures and edit file metadata. You can even preview your .usdz object under a variety of lighting conditions with the built-in Image-Based Lighting (IBL) options.




    PTC Vuforia 10.1

    In October 2015 PTC acquired Vuforia from Qualcomm for $65 million. Take into consideration that Qualcomm launched Vuforia in 2010, so Vuforia is the older sister in the AR family. Big sister is watching you, guys! ;)

    In November 2016 Unity Technologies and PTC announced a strategic collaboration to simplify AR development. Since then they have worked together to integrate new features of the Vuforia AR platform into the Unity game engine. Vuforia can be used with such development environments as Unity, MS Visual Studio, Apple Xcode and Android Studio. It supports a wide range of smartphones, tablets and AR smart glasses, such as HoloLens, Magic Leap, Vuzix M400, and ODG R7.

    Vuforia Engine boasts approximately the same main capabilities that you can find in the latest versions of ARKit, but it also has its own features, such as Model Targets with Deep Learning, VISLAM for markerless AR experiences, External Camera support for iOS, new experimental APIs for ARCore and ARKit, and support for the industry's latest AR glasses. The main advantage of Vuforia over ARKit and ARCore is that it has a wider list of supported devices and supports the development of Universal Windows Platform apps for Intel-based Windows 10 devices, including Microsoft Surface and HoloLens.

    Vuforia has a standalone version and a version baked directly into Unity. It has the following functionality:

    • Advanced Model Targets 360, recognition powered by AI;
    • Model Targets with Deep Learning, which allow you to instantly recognize objects by shape using pre-existing 3D models and ML algorithms;
    • Image Targets, the easiest way to put AR content on flat objects;
    • Multi Targets, for objects with flat surfaces and multiple sides;
    • Cylinder Targets, for placing AR content on objects with cylindrical shapes, like bottles;
    • Static Device Tracker, ideal for apps where the device will remain static, like on a tripod;
    • Ground Plane, a part of Smart Terrain, which enables digital content to be placed on floors and tabletop surfaces;
    • VuMarks, which allow you to identify and add content to series of objects;
    • Object Targets, for scanning an object;
    • Static and Adaptive Modes, for stationary and moving objects;
    • Simulation Play Mode, which allows developers to "walk through" or around the 3D model and see the final AR experience from their computer;
    • Vuforia Area Target Creator app, which enables us to scan and generate new Area Targets using depth-enabled mobile devices;
    • AR Session Recorder, which can record AR experiences on location; you can then use that recording with Playback mode in Unity for editing and updating;
    • and, of course, Vuforia Fusion and Vuforia Engine Area Targets.

    Vuforia Fusion is a capability designed to solve the problem of fragmentation in AR enabling technologies such as cameras, sensors, chipsets, and software frameworks like ARKit. With Vuforia Fusion, your app will automatically provide the best experience possible with no extra work required on your end.

    Vuforia Engine Area Targets enable developers to use an entire space, be it a factory floor or a retail store, as an AR target. Using the first supported device, a Matterport Pro2 camera, developers can create a detailed 3D scan of a desired location. Locations are recommended to be indoors, mostly static, and no larger than 1,000 sqm (around 10,000 sqft). Once the scan produces a 3D model, it can be converted into an Area Target with the Vuforia Area Target Generator. This target can then be brought into Unity.

    Vuforia API allows for a Static or Adaptive mode. When the real-world model remains stationary, like a large industrial machine, implementing the Static API will use significantly less processing power. This enables a longer lasting and higher performance experience for those models. For objects that won’t be stationary the Adaptive API allows for a continued robust experience.

    The External Camera feature is a part of the Vuforia Engine Driver Framework. External Camera provides a new perspective on what’s possible with Augmented Reality. It allows Vuforia Engine to access external video sources beyond the camera equipped in phones and tablets. By using an independent camera, developers can create an AR experience that offers a first-person view from toys, robots or industrial tools.

    Occlusion Management is one of the key features for building a realistic AR experience. When you're using Occlusion Management, Vuforia Engine detects and tracks targets, even when they’re partially hidden behind everyday barriers, like your hand. Special occlusion handling allows apps to display graphics as if they appear inside physical objects.

    Vuforia supports Metal acceleration for iOS devices. You can also use the Vuforia Samples for your projects. For example, the Vuforia Core Samples library includes various scenes using Vuforia features, including a pre-configured Object Recognition scene that you can use as a reference and starting point for an Object Recognition application.

    Here's an AR Foundation code snippet written in C#:

    // Assumes fields declared on the enclosing MonoBehaviour:
    //   ARSessionOrigin arOrigin;  Pose placementPose;  bool placementPoseIsValid;
    private void UpdatePlacementPose() {
    
        var screenCenter = Camera.main.ViewportToScreenPoint(new Vector3(0.5f, 0.5f));
        var hits = new List<ARRaycastHit>();
        arOrigin.Raycast(screenCenter, hits, TrackableType.Planes);
        
        placementPoseIsValid = hits.Count > 0;
    
        if (placementPoseIsValid) {
    
            placementPose = hits[0].pose;
        
            var cameraForward = Camera.current.transform.forward;
            var cameraBearing = new Vector3(cameraForward.x, 
                                            0, 
                                            cameraForward.z).normalized;
    
            placementPose.rotation = Quaternion.LookRotation(cameraBearing);
        }
    }
    

    Vuforia SDK Pricing Options:

    • Free license – you just need to register for a free Development License Key

    • Basic license ($42/month, billed annually) – For Students

    • Basic + Cloud license ($99/month) – For Small Businesses

    • Agency Package (personal price) – 5 short-term licenses

    • Pro license (personal price) – For All Companies Types

    Here are Pros and Cons.

    |------------------------------|------------------------------|
    |       "Vuforia PROs"         |        "Vuforia CONs"        | 
    |------------------------------|------------------------------|
    | Supports Android, iOS, UWP   | The price is not reasonable  |
    |------------------------------|------------------------------|
    | A lot of supported devices   | Poor developer documentation |
    |------------------------------|------------------------------|
    | External Camera support      | SDK has some issues and bugs |
    |------------------------------|------------------------------|
    | Webcam/Simulator Play Mode   | Doesn't support Geo tracking |
    |------------------------------|------------------------------|
    | Cylinder Targets support     | Poor potential in Unity      |
    |------------------------------|------------------------------|
    




    CONCLUSION:

    There are no vital limitations when developing with PTC Vuforia compared to ARCore and ARKit. Vuforia is a great old product, it supports a wider list of Apple and Android devices (even those that are not officially supported), and it supports several of the latest models of AR glasses.

    But in my opinion, ARKit with the Reality family of toolkits (RealityKit, Reality Composer and Reality Converter) has an extra bunch of useful, up-to-date features that Vuforia and ARCore only partially have. ARKit itself has better short-distance measurement accuracy within a room than any ARCore-compatible device, without any need for calibration. This is achieved through the use of the Apple LiDAR dToF scanner. ARCore now uses iToF cameras with the Raw Depth API. Both iToF and LiDAR allow you to create a high-quality virtual mesh with OcclusionMaterial for real-world surfaces at the scene understanding stage. This mesh is ready for measurement and ready for collision. With iToF and dToF sensors, these frameworks instantly detect non-planar surfaces and surfaces with no features at all, such as texture-free white walls in poorly lit rooms.

    Implementing iBeacon tools, ARWorldMaps and support for GPS will help you eliminate many tracking errors accumulated over time. And ARKit's tight integration with the Vision and CoreML frameworks makes a huge contribution to a robust AR toolset. Integration with Apple Maps allows ARKit to put GPS Location Anchors outdoors with the highest precision currently possible.

    Vuforia's measurement accuracy depends on what platform you're developing for. Some Vuforia features are built on top of a tracking engine (ARKit or ARCore). Even the popular Vuforia Chalk application uses the ARKit positional tracker.
