ARKit set ARAnchor transform based on touch location


Question

I'm playing around with the AR starter app in Xcode 9, where anchors are created in a scene on tap:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
  guard let sceneView = self.view as? ARSKView else {
    return
  }

  // Create anchor using the camera’s current position
  if let currentFrame = sceneView.session.currentFrame {

    // Create a transform with a translation of 0.2 meters in front     
    // of the camera
    var translation = matrix_identity_float4x4
    translation.columns.3.z = -0.2
    let transform = simd_mul(currentFrame.camera.transform, translation)

    // Add a new anchor to the session
    let anchor = ARAnchor(transform: transform)
    sceneView.session.add(anchor: anchor)
  }
}

This always results in the anchor being created in the middle of the screen regardless of where I tap, which makes sense, as we're getting the current camera transform and applying only a translation along the z axis to it.

I would like the anchor instead to be placed where I actually tapped with my finger. I can get the location of the tap using touches.first?.location(in: sceneView), i.e., the distance from the top-left corner of the screen in points, but I'm unsure how to map these 2D point coordinates onto the anchor's transform in meters, or which axes they correspond to.
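One direct way to get from the 2D tap to a 3D transform is to let ARKit hit-test the tap location against what it knows about the scene, instead of translating along the camera's z axis. A minimal sketch, assuming the same ARSKView setup as in the question (using feature points here; choose the hit-test types that fit your scene):

```swift
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
  guard let sceneView = self.view as? ARSKView,
        let touch = touches.first else {
    return
  }

  // 2D tap location in view coordinates (points, origin at top-left)
  let location = touch.location(in: sceneView)

  // Ask ARKit to intersect a ray through that screen point with the
  // feature points it has detected in the real world.
  let results = sceneView.hitTest(location, types: [.featurePoint])

  if let result = results.first {
    // worldTransform already encodes the 3D position (in meters)
    // corresponding to the 2D tap, so no manual conversion is needed.
    let anchor = ARAnchor(transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
  }
}
```

If plane detection is enabled, hit testing with `.existingPlaneUsingExtent` instead usually gives more stable results than raw feature points.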

Answer

I assume you're referring to Apple's ARKitExample project "Placing Virtual Objects in Augmented Reality".

Have a look at the method VirtualObject.translateBasedOnScreenPos(_:instantly:infinitePlane:), which is called when you move an (already placed) model on the screen; it basically has to solve the same problem you describe.

You'll find that this in turn calls ViewController.worldPositionFromScreenPosition(_:objectPos:infinitePlane:).

Extracted from that method, their approach is:

  1. Always do a hit test against existing plane anchors first. (If any such anchors exist, and only within their extents.)

  2. Collect more information about the environment by hit testing against the feature point cloud, but do not return the result yet.

  3. If desired or necessary (no good feature hit test result): hit test against an infinite, horizontal plane (ignoring the real world).

  4. If available, return the result of the hit test against high-quality features if the hit tests against infinite planes were skipped or no infinite plane was hit.

  5. As a last resort, perform a second, unfiltered hit test against features. If there are no features in the scene, the result returned here will be nil.

As you can see, they consider various aspects that may or may not apply to your use case. Consider studying and re-using (parts of) their approach.
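The fallback cascade above can be approximated with ARKit's built-in hit-test types. A condensed sketch (the function name and the simplifications are mine, assuming an ARSKView as in the question; the real project does extra filtering on the feature-point results):

```swift
// Returns a world transform for the tapped screen point, trying
// progressively weaker hit-test strategies, roughly mirroring the
// order used in Apple's example project.
func worldTransform(forTap point: CGPoint, in sceneView: ARSKView) -> matrix_float4x4? {
  // 1. Prefer existing plane anchors, constrained to their extents.
  if let plane = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first {
    return plane.worldTransform
  }
  // 2./4. Fall back to detected feature points.
  if let feature = sceneView.hitTest(point, types: .featurePoint).first {
    return feature.worldTransform
  }
  // 3. Otherwise intersect an estimated horizontal plane
  //    (this ignores the real-world geometry).
  if let estimated = sceneView.hitTest(point, types: .estimatedHorizontalPlane).first {
    return estimated.worldTransform
  }
  // 5. Nothing hit: there are no usable features in the scene.
  return nil
}
```

The returned transform can be passed straight to ARAnchor(transform:) as in the question's code.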
