iOS - Calculating distance, azimuth, elevation and relative position (Augmented Reality)

Problem description

I am starting to build an augmented reality app where you can place an image on the screen of your augmented reality camera view and it stays anchored at that position on Earth, so someone else can come by later and see it in the same place through their own camera view. For this I know I need to calculate some sort of distance factor along with azimuth and elevation.

So, I have already figured out how to send the object's graphics up to a server and retrieve it back, but how can I place it back in its original position relative to Earth? I know I need to calculate its:

  • Altitude
  • Coordinates
  • Azimuth
  • Elevation
  • Distance

But how would I calculate these and account for them/piece them together? I hope you understand what I mean.

To refine your understanding, let me give you a short demo of the app:

A man is in his house and decides to place an image of a painting on one of his walls. He opens the app, which defaults to the augmented reality screen, presses the plus button, and adds an image from his photo library. Behind the scenes, the app saves the location and positional data to a server. Later, someone else with the app opens its augmented reality screen nearby; it queries the server, finds images saved nearby, downloads the image, and places it back on the wall so the other man can see it with his phone as he walks by.

What approach should I take to achieve this? Any outline, links, resources, tutorials, thoughts, or experience would be appreciated. Thanks! This was a hard question to write down; I hope you can understand it. If not, please tell me and I will reword it.

罗汉

Recommended answer

I'm working on two AR iOS apps which do the following: convert azimuth (compass, horizontal angle) and elevation (gyroscope, vertical angle) to a position in 3D space (e.g. spherical to Cartesian).

The frameworks you need are:

  • CoreLocation
  • CoreMotion

Getting the geolocation (coordinates) is pretty straightforward for latitude, longitude, and altitude. You can easily find this information in several online sources, but this is the main call you need from the CLLocationManagerDelegate after you call startUpdatingLocation:

- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations
{
    // Delegate callback fired whenever a new location fix arrives
    latitude  = (float) manager.location.coordinate.latitude;   // degrees
    longitude = (float) manager.location.coordinate.longitude;  // degrees
    altitude  = (float) manager.location.altitude;              // metres above sea level
}
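For context, here is a minimal sketch of the location manager setup that has to happen before those callbacks fire; the locationManager property name and the requestWhenInUseAuthorization call (iOS 8+) are assumptions for illustration, not part of the original answer:

// Sketch: CLLocationManager setup (property name is illustrative)
self.locationManager = [[CLLocationManager alloc] init];
self.locationManager.delegate = self;                        // must conform to CLLocationManagerDelegate
self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;
[self.locationManager requestWhenInUseAuthorization];        // iOS 8+, assumed; requires NSLocationWhenInUseUsageDescription in Info.plist
[self.locationManager startUpdatingLocation];                 // drives locationManager:didUpdateLocations:
[self.locationManager startUpdatingHeading];                  // drives locationManager:didUpdateHeading: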

Getting the azimuth angle is also pretty straightforward, using the same delegate as the location after calling startUpdatingHeading:

- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    // Delegate callback fired whenever a new compass reading arrives;
    // magneticHeading is in degrees (0-360, clockwise from magnetic north)
    azimuth = (float) newHeading.magneticHeading;
}

Elevation is extracted from the gyroscope, which doesn't have a delegate but is also easy to set up. The call looks something like this (note: this works for my app running in landscape mode, check yours):

// Roll (in radians) from CoreMotion's attitude; in landscape this maps to elevation
elevation = fabsf(self.motionManager.deviceMotion.attitude.roll);
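For reference, a minimal CoreMotion setup might look like the following; the motionManager property name and the update interval are assumptions for illustration:

// Sketch: CMMotionManager setup (property name and interval are illustrative)
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;  // ~60 Hz
[self.motionManager startDeviceMotionUpdates];               // then poll deviceMotion.attitude as needed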

Finally, you can convert your orientation coordinates into a 3D point like so:

- (GLKVector3)sphericalToCartesian:(float)radius azimuth:(float)theta elevation:(float)phi
{
    // Convert Coordinates: Spherical to Cartesian
    // Spherical: Radial Distance (r), Azimuth (θ), Elevation (φ)
    // Cartesian: x, y, z

    float x = radius * sinf(phi) * sinf(theta);
    float y = radius * cosf(phi);
    float z = radius * sinf(phi) * cosf(theta);
    return GLKVector3Make(x, y, z);
}

For this last part be very wary of angle and axis naming conventions as they vary wildly from source to source. In my system, θ is the angle on the horizontal plane, φ is the angle on the vertical plane, x is left-right, y is down-up, and z is back-front.

As for distance, I'm not sure you really need to use it, but if you do then just substitute it for "radius".
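To tie the pieces together, here is a hedged sketch of how the values above might feed the conversion. The distance between the viewer and the saved image comes from CLLocation's distanceFromLocation:, magneticHeading is in degrees so it is converted to radians with GLKit, and savedLatitude/savedLongitude as well as the locationManager property are hypothetical placeholders from the earlier sketch:

// Sketch: place a saved image relative to the viewer (names are illustrative)
CLLocation *imageLocation  = [[CLLocation alloc] initWithLatitude:savedLatitude
                                                         longitude:savedLongitude];
CLLocation *viewerLocation = self.locationManager.location;

// Distance in metres between the viewer and the saved image
float distance = (float) [viewerLocation distanceFromLocation:imageLocation];

// magneticHeading is in degrees; attitude.roll is already in radians
float theta = GLKMathDegreesToRadians(azimuth);
float phi   = elevation;

// Use the distance as the radius of the spherical coordinates
GLKVector3 position = [self sphericalToCartesian:distance azimuth:theta elevation:phi];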

Hope that helps.
