iOS - Calculating distance, azimuth, elevation and relative position (Augmented Reality)


Problem Description



I am starting to build an augmented reality app where you can place an image on your augmented reality camera view and it stays in that position on Earth, so someone else can come by and see it in their own augmented reality camera view. For this I know I need to calculate some sort of distance factor along with azimuth and elevation.

So, I have already figured out how to send the object's graphics up to a server and retrieve them, but how can I place the object back in its original position relative to Earth? I know I need to calculate its:

  • Altitude
  • Coordinates
  • Azimuth
  • Elevation
  • Distance

But how would I calculate these values and piece them together? I hope you understand what I mean.

To refine your understanding, let me give you a short demo of the app:

A man is in his house and decides to place an image of a painting on one of his walls. He opens the app, which defaults to the augmented reality screen, presses the plus button, and adds an image from his photo library. Behind the scenes, the app saves the image's location and positional data to a server. Someone else with the app and its augmented reality screen comes by; the app queries the server for images saved nearby, downloads the image, and places it on the wall so the other man can see it with his phone as he moves around.

What approach should I take to achieve this? Any outline, links, resources, tutorials, thoughts, or experience would be appreciated. Thanks! This was a hard question to write down; I hope you can understand it. If not, please tell me and I will reword.

Rohan

Solution

I'm working on two AR iOS apps which do the following: convert azimuth (compass, horizontal angle) and elevation (gyroscope, vertical angle) into a position in 3D space (i.e., spherical to Cartesian).

The frameworks you need are:

  • CoreLocation
  • CoreMotion

Getting the geolocation (coordinates) is pretty straightforward for latitude, longitude, and altitude. You can easily find this information in several online sources, but this is the main call you need from the CLLocationManagerDelegate after you call startUpdatingLocation:

- (void)locationManager:(CLLocationManager *)manager didUpdateLocations:(NSArray *)locations
{
    // Geographic position: degrees for latitude/longitude, meters for altitude.
    latitude = (float) manager.location.coordinate.latitude;
    longitude = (float) manager.location.coordinate.longitude;
    altitude = (float) manager.location.altitude;
}
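
Neither delegate callback fires until a CLLocationManager has been created, assigned a delegate, and started. A minimal setup sketch, assuming a view controller that owns a strong locationManager property and conforms to CLLocationManagerDelegate (the property name and placement are my assumptions, not from the original answer):

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the manager and register for location and heading callbacks.
    self.locationManager = [[CLLocationManager alloc] init];
    self.locationManager.delegate = self;
    self.locationManager.desiredAccuracy = kCLLocationAccuracyBest;

    // On iOS 8+ you must also request permission (and add an
    // NSLocationWhenInUseUsageDescription entry to Info.plist).
    if ([self.locationManager respondsToSelector:@selector(requestWhenInUseAuthorization)]) {
        [self.locationManager requestWhenInUseAuthorization];
    }

    [self.locationManager startUpdatingLocation];   // feeds didUpdateLocations:
    [self.locationManager startUpdatingHeading];    // feeds didUpdateHeading:
}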

Getting the azimuth angle is also pretty straightforward, using the same delegate as the location after calling startUpdatingHeading:

- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    // Heading relative to magnetic north, in degrees (0-360).
    azimuth  = (float) manager.heading.magneticHeading;
}

Elevation is extracted from the gyroscope, which doesn't have a delegate but is also easy to set up. The call looks something like this (note: this works for my app running in landscape mode, check yours):

elevation = fabsf(self.motionManager.deviceMotion.attitude.roll);
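
The motionManager property itself isn't shown in the answer; one plausible way to set it up (the 60 Hz update interval is my choice for illustration, and you'd #import <CoreMotion/CoreMotion.h> wherever the manager is declared):

// Apple recommends one CMMotionManager per app; here it is assumed to be
// a strong property on the same view controller.
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;   // ~60 Hz
[self.motionManager startDeviceMotionUpdates];

// Later, e.g. once per rendered frame, sample the current attitude:
elevation = fabsf(self.motionManager.deviceMotion.attitude.roll);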

Finally, you can convert your orientation coordinates into a 3D point like so:

- (GLKVector3)sphericalToCartesian:(float)radius azimuth:(float)theta elevation:(float)phi
{
    // Convert Coordinates: Spherical to Cartesian
    // Spherical: Radial Distance (r), Azimuth (θ), Elevation (φ)
    // Cartesian: x, y, z

    float x = radius * sinf(phi) * sinf(theta);
    float y = radius * cosf(phi);
    float z = radius * sinf(phi) * cosf(theta);
    return GLKVector3Make(x, y, z);
}

For this last part be very wary of angle and axis naming conventions as they vary wildly from source to source. In my system, θ is the angle on the horizontal plane, φ is the angle on the vertical plane, x is left-right, y is down-up, and z is back-front.
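
One unit mismatch worth flagging: CLHeading reports degrees, while sinf/cosf expect radians (attitude.roll from CoreMotion is already in radians). A hypothetical call site, using GLKit's conversion helper and an arbitrary radius of 10 units:

// Convert the compass heading from degrees to radians before any trig.
float theta = GLKMathDegreesToRadians(azimuth);
float phi   = elevation;   // roll is already in radians

// 10.0f is an illustrative radius; see the note on distance below.
GLKVector3 position = [self sphericalToCartesian:10.0f azimuth:theta elevation:phi];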

As for distance, I'm not sure you really need to use it but if you do then just substitute it for "radius".
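
If you do compute a real distance, CoreLocation can derive one from two coordinates directly. A sketch, where savedLatitude and savedLongitude are hypothetical values fetched back from the server:

// Great-circle distance in meters between the viewer and the saved image;
// this value can then stand in for "radius" above.
CLLocation *viewer = [[CLLocation alloc] initWithLatitude:latitude longitude:longitude];
CLLocation *saved  = [[CLLocation alloc] initWithLatitude:savedLatitude longitude:savedLongitude];
CLLocationDistance distance = [viewer distanceFromLocation:saved];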

Hope that helps
