LiDAR and RealityKit – Capture a Real World Texture for a Scanned Model


Problem Description


I would like to capture a real-world texture and apply it to a 3D mesh produced with the help of the LiDAR scanner. I suppose that Projection-View-Model matrices should be used for that. The texture must be made from a fixed point of view, for example, from the center of a room. However, it would be an ideal solution if we could apply the environmentTexturing data, collected as a cube-map texture of the scene.
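As a sketch of the Projection-View-Model idea mentioned above, a world-space vertex can be mapped into the captured image's UV space with ARCamera's projection and view matrices. This is only an illustration under assumed conventions (landscape-right orientation, image UV with a flipped Y axis); `worldVertex` and `textureUV` are hypothetical names, not ARKit API:

```swift
import ARKit
import simd

// Maps a world-space vertex into [0, 1] UV coordinates of the captured image.
// `camera` is the ARCamera of the frame whose image will be used as a texture.
func textureUV(for worldVertex: SIMD3<Float>,
               camera: ARCamera,
               viewportSize: CGSize) -> SIMD2<Float>? {
    let projection = camera.projectionMatrix(for: .landscapeRight,
                                             viewportSize: viewportSize,
                                             zNear: 0.001, zFar: 1000)
    let view = camera.viewMatrix(for: .landscapeRight)
    // Projection * View * world position — the "PVM" chain; the model matrix
    // is assumed to be already applied to `worldVertex`.
    let clip = projection * view * SIMD4<Float>(worldVertex, 1)
    guard clip.w > 0 else { return nil }          // point is behind the camera
    let ndc = SIMD2<Float>(clip.x, clip.y) / clip.w
    // NDC [-1, 1] → UV [0, 1], flipping Y for image coordinates.
    return SIMD2<Float>(ndc.x * 0.5 + 0.5, 0.5 - ndc.y * 0.5)
}
```

Vertices that project outside the [0, 1] UV range are not visible from that point of view and would need a different frame.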


Look at 3D Scanner App. It's a reference app allowing us to export a model with its texture.


I need to capture a texture in one pass; I do not need to update it in real time. I realize that changing the point of view leads to a wrong texture perception, in other words, to a distorted texture. I also realize that RealityKit performs dynamic tessellation and automatic texture mipmapping (a texture's resolution depends on the distance it was captured from).

import RealityKit
import ARKit
import MetalKit
import ModelIO

class ViewController: UIViewController, ARSessionDelegate {
    
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.session.delegate = self
        arView.debugOptions.insert(.showSceneUnderstanding)

        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh      // LiDAR scene mesh
        config.environmentTexturing = .manual   // environment probes placed manually
        arView.session.run(config)
    }
}
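The `.mesh` reconstruction above delivers the scanned geometry as ARMeshAnchors through the session delegate. A minimal sketch of reading one vertex out of such an anchor (this delegate method would live in the ViewController above; the buffer layout assumed here is the packed `.float3` format ARKit documents for `ARGeometrySource`):

```swift
import ARKit

// Collects the LiDAR mesh as it is reconstructed. Each ARMeshAnchor carries
// an ARMeshGeometry whose vertices are expressed in the anchor's local space.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let meshAnchor as ARMeshAnchor in anchors {
        let vertices = meshAnchor.geometry.vertices     // ARGeometrySource
        // Read the first vertex out of the Metal buffer (packed float3).
        let pointer = vertices.buffer.contents()
            .advanced(by: vertices.offset)
            .assumingMemoryBound(to: Float.self)
        let localVertex = SIMD3<Float>(pointer[0], pointer[1], pointer[2])
        // Transform to world space before projecting it into a camera image.
        let worldVertex = meshAnchor.transform * SIMD4<Float>(localVertex, 1)
        _ = worldVertex
    }
}
```

Iterating all `vertices.count` entries with `vertices.stride` gives every vertex of the anchor's mesh.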

Question

  • How can I capture and apply a real-world texture to a reconstructed 3D mesh?

Answer


    Here is a preliminary solution (it's not a final one):

    import MetalKit
    import ARKit
    
    /*  Helpers such as textureCache, rgbRadius, viewToCamera, viewportSize,
        commandQueue and renderDestination are defined elsewhere; the snippet
        appears to be adapted from Apple's point-cloud sample code.  */
    
    /*  Color model YCbCr  */
    var capturedTextureChannelY: CVMetalTexture?      /*  Luma               */
    var capturedTextureChannelCbCr: CVMetalTexture?   /*  Chroma difference  */
    
    lazy var rgbUniforms: RGBUniforms = {
        var uniforms = RGBUniforms()
        uniforms.radius = rgbRadius
        uniforms.viewToCamera.copy(from: viewToCamera)
        uniforms.viewRatio = Float(viewportSize.width / viewportSize.height)
        return uniforms
    }()
    
    func updateTextures(frame: ARFrame) {
        let pixelBuffer = frame.capturedImage
        guard CVPixelBufferGetPlaneCount(pixelBuffer) >= 2 else { return }  
      
        capturedTextureChannelY = makeTexture(fromPixelBuffer: pixelBuffer, 
                                                  pixelFormat: .r8Unorm, 
                                                   planeIndex: 0)
        capturedTextureChannelCbCr = makeTexture(fromPixelBuffer: pixelBuffer, 
                                                     pixelFormat: .rg8Unorm, 
                                                      planeIndex: 1)
    }
    
    func makeTexture(fromPixelBuffer pixelBuffer: CVPixelBuffer, 
                                     pixelFormat: MTLPixelFormat, 
                                      planeIndex: Int) -> CVMetalTexture? {
    
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, planeIndex)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, planeIndex)
        
        var texture: CVMetalTexture? = nil
        let status = CVMetalTextureCacheCreateTextureFromImage(nil, 
                                                               textureCache, 
                                                               pixelBuffer, 
                                                               nil, 
                                                               pixelFormat, 
                                                               width, 
                                                               height, 
                                                               planeIndex, 
                                                               &texture)
          
        if status != kCVReturnSuccess {
            texture = nil
        }
        return texture
    }
    
    func draw() {
        guard let currentFrame = session.currentFrame,
              let commandBuffer = commandQueue.makeCommandBuffer(),
              let renderDescriptor = renderDestination.currentRenderPassDescriptor,
              let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderDescriptor)
        else { return }
    
        self.updateTextures(frame: currentFrame)
        
        if rgbUniforms.radius > 0 {
            var retainingTextures = [capturedTextureChannelY, 
                                     capturedTextureChannelCbCr]
    
            commandBuffer.addCompletedHandler { buffer in
                retainingTextures.removeAll()
            }
    
            renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(capturedTextureChannelY!), 
                                                                      index: Int(kTextureY.rawValue))
            renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(capturedTextureChannelCbCr!), 
                                                                      index: Int(kTextureCbCr.rawValue))
            renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
        }

        renderEncoder.endEncoding()
        commandBuffer.commit()
    }
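The two planes captured above are full-range BT.601 YCbCr; a fragment shader recombines them into RGB per pixel. The same math, written out in plain Swift for clarity (the coefficients match the conversion matrix used in Apple's sample shaders; the function name is illustrative):

```swift
// Converts one full-range BT.601 YCbCr sample (as delivered in the Y and
// CbCr planes) to RGB. Chroma components are centered on 0.5.
func rgb(fromY y: Float, cb: Float, cr: Float) -> (r: Float, g: Float, b: Float) {
    let r = y + 1.4020 * (cr - 0.5)
    let g = y - 0.3441 * (cb - 0.5) - 0.7141 * (cr - 0.5)
    let b = y + 1.7720 * (cb - 0.5)
    return (r, g, b)
}
```

For example, a sample with full luma and neutral chroma, `rgb(fromY: 1, cb: 0.5, cr: 0.5)`, comes out as white.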
    

    P.S.


I found a post called "LiDAR equipped for 3D modelling" on the Apple Developer Forum. It says:

Question:


    Can Camera and LiDAR sensor work together to achieve a 3D model with texture?

Answer:


Yes, that is (partially) possible. You can project any geometry of an anchor back into the camera image to reason about the texture. However, this requires multiple viewpoints and some form of higher-level logic to decide which projection to apply to which part of your geometry.


    Frameworks Engineer
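The back-projection the engineer describes can be sketched with ARCamera's `projectPoint`, which maps a world-space point to pixel coordinates of the captured image. This is only an illustration (the landscape-right orientation is an assumption about the sensor image, and `pixel(for:in:)` is a hypothetical helper):

```swift
import ARKit

// Projects one world-space mesh vertex back into the captured image and
// returns its pixel position, or nil if it falls outside the image.
func pixel(for worldVertex: SIMD3<Float>, in frame: ARFrame) -> CGPoint? {
    let imageSize = CGSize(width: CVPixelBufferGetWidth(frame.capturedImage),
                           height: CVPixelBufferGetHeight(frame.capturedImage))
    let point = frame.camera.projectPoint(worldVertex,
                                          orientation: .landscapeRight,
                                          viewportSize: imageSize)
    guard point.x >= 0, point.x < imageSize.width,
          point.y >= 0, point.y < imageSize.height else { return nil }
    // Multiple viewpoints and an occlusion test are still needed to decide
    // which frame's pixels should texture which triangle.
    return point
}
```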

