How to use Raycast methods in RealityKit?


Problem Description

There are three methods for detecting intersections in the RealityKit framework, but I don't know how to use them in my project.

1.

func raycast(origin: SIMD3<Float>, 
          direction: SIMD3<Float>, 
             length: Float, 
              query: CollisionCastQueryType, 
               mask: CollisionGroup, 
         relativeTo: Entity?) -> [CollisionCastHit]

2.

func raycast(from: SIMD3<Float>, 
               to: SIMD3<Float>, 
            query: CollisionCastQueryType, 
             mask: CollisionGroup, 
       relativeTo: Entity?) -> [CollisionCastHit]

3.

func convexCast(convexShape: ShapeResource, 
               fromPosition: SIMD3<Float>, 
            fromOrientation: simd_quatf, 
                 toPosition: SIMD3<Float>, 
              toOrientation: simd_quatf, 
                      query: CollisionCastQueryType, 
                       mask: CollisionGroup, 
                 relativeTo: Entity?) -> [CollisionCastHit]

Solution

Simple Ray-Casting

If you want to find out how to use the ray-casting method to place a model made in Reality Composer into a RealityKit scene (onto a detected horizontal plane), use the following code:

import RealityKit
import ARKit

class ViewController: UIViewController {
    
    @IBOutlet var arView: ARView!
    let scene = try! Experience.loadScene()
    
    @IBAction func onTap(_ sender: UITapGestureRecognizer) {
        
        scene.steelBox!.name = "Parcel"
        
        // Convert the tap into a 2D point in the view's coordinate space
        let tapLocation: CGPoint = sender.location(in: arView)
        let estimatedPlane: ARRaycastQuery.Target = .estimatedPlane
        let alignment: ARRaycastQuery.TargetAlignment = .horizontal
                
        // ARKit ray-cast from the screen point onto an estimated horizontal plane
        let result: [ARRaycastResult] = arView.raycast(from: tapLocation,
                                                   allowing: estimatedPlane,
                                                  alignment: alignment)
        
        guard let rayCast: ARRaycastResult = result.first
        else { return }
        
        // Anchor the Reality Composer scene at the hit's world transform
        let anchor = AnchorEntity(world: rayCast.worldTransform)
        anchor.addChild(scene)
        arView.scene.anchors.append(anchor)
        
        print(rayCast)
    }
}

Pay attention to the ARRaycastQuery class. It comes from ARKit, not from RealityKit.

Convex-Ray-Casting

A convex-ray-casting method like raycast(from:to:query:mask:relativeTo:) is the operation of sweeping a convex shape along a straight line and stopping at the very first intersection with any of the collision shapes in the scene. The scene's raycast() method performs a hit test against all entities with collision shapes in the scene. Entities without a collision shape are ignored.
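Because entities without a collision shape are invisible to a collision cast, make sure your models have one before casting. A minimal sketch (the boxEntity here is a hypothetical example, not part of the original answer):

import RealityKit

// Hypothetical model entity used only for illustration
let boxEntity = ModelEntity(mesh: .generateBox(size: 0.2))

// Build collision shapes from the model's mesh bounds so that
// scene.raycast(...) and convexCast(...) can actually hit it
boxEntity.generateCollisionShapes(recursive: true)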

You can use the following code to perform a convex ray-cast from a start position to an end position:

import RealityKit

let startPosition: SIMD3<Float> = [0, 0, 0]
let endPosition: SIMD3<Float> = [5, 5, 5]
let query: CollisionCastQueryType = .all
let mask: CollisionGroup = .all

let raycasts: [CollisionCastHit] = arView.scene.raycast(from: startPosition, 
                                                          to: endPosition, 
                                                       query: query,  
                                                        mask: mask, 
                                                  relativeTo: nil)

guard let rayCast: CollisionCastHit = raycasts.first
else { return }
    
print(rayCast.distance)      /* The distance from the ray origin to the hit */
print(rayCast.entity.name)   /* The entity's name that was hit              */
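The first method from the question, raycast(origin:direction:length:query:mask:relativeTo:), works the same way but takes a ray origin, a direction vector, and a length instead of two points. A minimal sketch, with illustrative values for the ray:

import RealityKit

let origin: SIMD3<Float> = [0, 0, 0]         /* ray starts at the world origin */
let direction: SIMD3<Float> = [0, 0, -1]     /* straight ahead along -Z        */
let length: Float = 10                       /* cast up to 10 meters           */

let hits: [CollisionCastHit] = arView.scene.raycast(origin: origin,
                                                 direction: direction,
                                                    length: length,
                                                     query: .nearest,
                                                      mask: .all,
                                                relativeTo: nil)

if let nearest = hits.first {
    print(nearest.distance)      /* distance from the ray origin to the hit */
}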

The CollisionCastHit structure is the hit result of a collision cast, and it lives in RealityKit's Scene.
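The third method from the question, convexCast(convexShape:fromPosition:fromOrientation:toPosition:toOrientation:query:mask:relativeTo:), sweeps a whole convex shape instead of an infinitely thin ray and returns the same [CollisionCastHit] array. A rough sketch, with an illustrative sphere shape and identity orientations:

import RealityKit

let sphereShape: ShapeResource = .generateSphere(radius: 0.1)
let identity = simd_quatf(ix: 0, iy: 0, iz: 0, r: 1)     /* no rotation */

let sweepHits: [CollisionCastHit] = arView.scene.convexCast(convexShape: sphereShape,
                                                           fromPosition: [0, 0, 0],
                                                        fromOrientation: identity,
                                                             toPosition: [5, 5, 5],
                                                          toOrientation: identity,
                                                                  query: .all,
                                                                   mask: .all,
                                                             relativeTo: nil)

guard let firstHit = sweepHits.first
else { return }

print(firstHit.position)     /* world-space point where the swept sphere first hits something */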

P.S.

When you use the raycast(from:to:query:mask:relativeTo:) method to measure the distance from the camera to an entity, the orientation of the ARCamera doesn't matter; only its position in world coordinates does.
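For example, here is a minimal sketch that casts from the current camera position toward a target entity and reads the hit distance (the targetEntity is a hypothetical entity from your own scene, with collision shapes already generated):

import RealityKit

// Current camera position in world space (ARView.cameraTransform)
let cameraPosition: SIMD3<Float> = arView.cameraTransform.translation

// Hypothetical entity already added to the scene
let targetPosition: SIMD3<Float> = targetEntity.position(relativeTo: nil)

let cameraHits: [CollisionCastHit] = arView.scene.raycast(from: cameraPosition,
                                                            to: targetPosition,
                                                         query: .nearest,
                                                          mask: .all,
                                                    relativeTo: nil)

if let hit = cameraHits.first {
    print(hit.distance)      /* distance in meters from the camera to the entity */
}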
