What sensors does ARCore use?


Question

What sensors does ARCore use: single camera, dual-camera, IMU, etc. in a compatible phone?

Also, is ARCore dynamic enough to keep working if a sensor becomes unavailable, by switching to a less accurate version of itself?

Answer

Updated: July 24, 2021.

Google's ARCore, like Apple's ARKit, uses a similar set of sensors to track a real-world environment. ARCore can use a single RGB camera along with an IMU, which is a combination of an accelerometer, a magnetometer and a gyroscope. Your phone runs world tracking at 60 fps, while the Inertial Measurement Unit operates at 1000 Hz. In addition, there is one more sensor that ARCore can use – an iToF camera for scene reconstruction (Apple's name for it is LiDAR). ARCore 1.25 supports the Raw Depth API and the Full Depth API.
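
If you want to check at runtime whether a particular phone exposes that depth capability (a hardware iToF sensor, or depth computed from motion with the single RGB camera), the ARCore SDK lets you query and enable it when configuring the session. A minimal Kotlin sketch; the helper name is mine, and the Session is assumed to be created elsewhere in your app:

    import com.google.ar.core.Config
    import com.google.ar.core.Session

    // Enable the Depth API when the device supports it. AUTOMATIC covers both
    // phones with an iToF sensor and phones that derive depth from motion with
    // the single RGB camera; RAW_DEPTH_ONLY exposes the Raw Depth API, i.e. the
    // unsmoothed depth image together with a confidence image.
    fun enableDepthIfSupported(session: Session) {
        val config = Config(session)
        when {
            session.isDepthModeSupported(Config.DepthMode.AUTOMATIC) ->
                config.depthMode = Config.DepthMode.AUTOMATIC
            session.isDepthModeSupported(Config.DepthMode.RAW_DEPTH_ONLY) ->
                config.depthMode = Config.DepthMode.RAW_DEPTH_ONLY
            else ->
                config.depthMode = Config.DepthMode.DISABLED   // plain RGB + IMU tracking
        }
        session.configure(config)
    }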

Read what Google says about the COM (Concurrent Odometry and Mapping) method, built on Camera + IMU:

Concurrent Odometry and Mapping – An electronic device tracks its motion in an environment while building a three-dimensional visual representation of the environment that is used for fixing a drift in the tracked motion.

Here's Google's US15595617 patent: System and method for concurrent odometry and mapping.


  • In 2014...2017 Google tended towards a Multicam + DepthCam config (Tango project)
  • In 2018...2020 Google tended towards a SingleCam + IMU config
  • In 2021 Google returned to a Multicam + DepthCam config


We all know that the biggest problem for Android devices is calibration. iOS devices don't have this issue (because Apple controls its own hardware and software). Poor calibration leads to errors in 3D tracking, hence all your virtual 3D objects may "float" in a poorly-tracked scene. If you use a phone without an iToF sensor, there's no miraculous button against bad tracking (and you can't switch to a less accurate version of tracking). The only solution in such a situation is to re-track your scene from scratch. However, tracking quality is much higher when your device is equipped with a ToF camera.
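
ARCore has no "reset tracking" run option the way ARKit does, so "re-track from scratch" in practice means watching the camera's tracking state every frame and, when it stays lost, tearing the Session down and building a new one. A rough Kotlin sketch under those assumptions; the onTrackingLost callback is hypothetical and stands for whatever your app does to close and recreate its Session:

    import com.google.ar.core.Frame
    import com.google.ar.core.TrackingState

    // Call from the render loop with the Frame returned by session.update().
    var pausedFrames = 0

    fun checkTracking(frame: Frame, onTrackingLost: () -> Unit) {
        if (frame.camera.trackingState == TrackingState.TRACKING) {
            pausedFrames = 0          // tracking is healthy again
            return
        }
        pausedFrames++
        if (pausedFrames > 180) {     // roughly 3 seconds at 60 fps; threshold is arbitrary
            onTrackingLost()          // e.g. session.close(), then create a fresh Session
        }
    }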

Here are four main rules for good tracking results (if you have no ToF camera):

  1. Track your scene not too fast, not too slow

  2. Track appropriate surfaces and objects

  3. Use a well-lit environment when tracking

  4. Don't track reflective or refractive objects

Horizontal planes are more reliable for tracking than vertical ones.

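Those rules map fairly directly onto the failure reasons ARCore itself reports while tracking is degraded, so you can surface them as on-screen hints. A hedged Kotlin sketch; the function name and hint strings are mine, not part of the SDK:

    import com.google.ar.core.Camera
    import com.google.ar.core.TrackingFailureReason
    import com.google.ar.core.TrackingState

    // Translate ARCore's diagnosis of poor tracking into user guidance.
    fun trackingHint(camera: Camera): String? {
        if (camera.trackingState == TrackingState.TRACKING) return null
        return when (camera.trackingFailureReason) {
            TrackingFailureReason.EXCESSIVE_MOTION      -> "Move the phone more slowly"       // rule 1
            TrackingFailureReason.INSUFFICIENT_FEATURES -> "Aim at a textured surface"        // rules 2 and 4
            TrackingFailureReason.INSUFFICIENT_LIGHT    -> "Find a better lit area"           // rule 3
            TrackingFailureReason.CAMERA_UNAVAILABLE    -> "Camera is in use by another app"
            TrackingFailureReason.BAD_STATE             -> "Tracking lost, restart the session"
            else                                        -> "Initializing..."                  // NONE
        }
    }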

One of the biggest problems of ARCore (and of ARKit too) is Energy Impact. We understand that the higher the frame rate, the better the tracking results. But the energy impact at 30 fps is HIGH, and at 60 fps it's VERY HIGH. Such an energy impact quickly drains your smartphone's battery (due to the enormous burden on the CPU/GPU). So just imagine that you use 2 cameras for ARCore: your phone must process 2 image sequences at 60 fps in parallel, as well as process and store feature points and AR anchors, and at the same time it must render animated 3D graphics with hi-res textures at 60 fps. That's too much for your CPU/GPU. In such a case the battery will be dead in 30 minutes and will be as hot as a boiler)). It seems users don't like that, because it's not a good AR experience.
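
If battery life matters more than tracking smoothness, ARCore lets you ask for a 30 fps camera config instead of 60 fps before the session is resumed. A short Kotlin sketch; the helper name is mine, the filter and config calls come from the public SDK, and the config change must happen while the session is paused:

    import com.google.ar.core.CameraConfig
    import com.google.ar.core.CameraConfigFilter
    import com.google.ar.core.Session
    import java.util.EnumSet

    // Trade tracking smoothness for battery life by requesting a 30 fps camera config.
    fun preferLowPowerCameraConfig(session: Session) {
        val filter = CameraConfigFilter(session)
            .setTargetFps(EnumSet.of(CameraConfig.TargetFps.TARGET_FPS_30))
        val configs = session.getSupportedCameraConfigs(filter)
        if (configs.isNotEmpty()) {
            session.cameraConfig = configs[0]   // take the first matching config
        }
    }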

