Antialiasing a SceneKit rendering with Metal


Problem Description


I'm new to Metal. I'm rendering a SceneKit scene with Metal using this Apple sample code. TL;DR: it calls the SCNRenderer's render function and passes in a command buffer. I'm compiling for Big Sur.

It works, but it is not anti-aliased. I've tried a few ways to achieve it, as you can see in the updates below.

Without Metal, I'd just set isJitteringEnabled to true on the SCNRenderer, and I get beautiful (and slow) 96-ish-pass renderings. If I try to do this with Metal, I get weird pixel format mismatches, so I'm suspecting the two just aren't compatible.

With Metal, as far as I can tell, the simplest way to achieve antialiasing is to enable multisampling in the render pipeline (I know how to do that) and use a multisampled texture (MTLTextureType.type2DMultisample). This partial answer backs up my assumption.
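Concretely, the kind of texture a multisampled pass draws into can be described up front. A minimal sketch (the sample count of 4 is illustrative, not taken from the sample project):

```swift
import Metal

// Sketch: describe the multisampled render target an MSAA pass draws into.
// The descriptor alone needs no GPU; a device only comes in at makeTexture time.
let msaaDescriptor = MTLTextureDescriptor()
msaaDescriptor.textureType = .type2DMultisample
msaaDescriptor.pixelFormat = .bgra8Unorm_srgb
msaaDescriptor.sampleCount = 4          // illustrative; must be supported by the device
msaaDescriptor.usage = .renderTarget
msaaDescriptor.storageMode = .private   // GPU-only; it gets resolved, never read back
```

A texture made from such a descriptor can serve as a color attachment, but, as the rest of this question shows, the texture Core Video hands back cannot be re-described this way after the fact.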

And that's the problem. I don't know how to change the texture type when I get my texture from CVMetalTextureCache and CVMetalTextureCacheCreateTextureFromImage. It seems this is a limitation in Core Video's Metal support?

My full source is here

That's it. The rest of this post is more details on the stuff I tried.

(I think this might be possible using a shader. I'm open to that solution as well, but I don't know where to start. This example doesn't compile, and this example is for GLSL.)


My pixel buffer atts look like this

        let pixelbufferAttributes = [
            kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey: exportSettings.width,
            kCVPixelBufferHeightKey: exportSettings.height,
            kCVPixelBufferMetalCompatibilityKey: true] as [String: Any]
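For context, kCVPixelBufferMetalCompatibilityKey is what makes the resulting buffers wrappable as Metal textures. Creating a single compatible buffer directly looks like this (a sketch; the 1920x1080 size is illustrative, the project reads its dimensions from exportSettings):

```swift
import CoreVideo

// Sketch: create one Metal-compatible BGRA pixel buffer with the same
// attributes the pool uses above.
var pixelBuffer: CVPixelBuffer?
let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                 1920, 1080,
                                 kCVPixelFormatType_32BGRA,
                                 [kCVPixelBufferMetalCompatibilityKey: true] as CFDictionary,
                                 &pixelBuffer)
guard status == kCVReturnSuccess, let buffer = pixelBuffer else {
    fatalError("CVPixelBufferCreate failed: \(status)")
}
```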

For each frame, it creates a new pixel buffer from the pool and wraps it in a Metal texture from a cache, like this

        let pixelFormat = MTLPixelFormat.bgra8Unorm_srgb
        var optionalMetalTexture: CVMetalTexture?
        err = CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            metalTextureCache, // object prop
            pixelBuffer,
            nil, // texture attributes
            pixelFormat,
            exportSettings.width,
            exportSettings.height,
            0, // planeIndex
            &optionalMetalTexture)
        guard err == noErr, let metalTexture = optionalMetalTexture else {
            fatalError("Failed to create metal texture wrapper from pixel buffer \(err)")
        }

Attempt: Change the texture descriptor

Since I'm creating my Metal texture from a CVPixelBuffer with CVMetalTextureCacheCreateTextureFromImage, I can't figure out how to set its attributes and make it multisampled.

Attempt: Try H264

It didn't change anything. I also tried changing just the alpha quality, using HEVC with alpha, but there was no change.

Attempt: Enable multi sampling

I was able to get my pipeline to pick up that I wanted multisampling, but it crashes because the texture is not set up for multisampling (more precisely, because it is not an MTLTexture of type .type2DMultisample; see the docs).

Attempt: Copy the MTLTexture created by Core Video

I tried to use an MTLBlitCommandEncoder to copy the texture I was given by Core Video into a texture I had set up with the right attributes. But it crashes, telling me that the attributes don't match.

I'm starting to think there's no solution to this?

Solution

Enabling multisampling was the right idea. The patch below shows how: render into a private multisampled texture and have the render pass resolve it into the Core Video-backed texture via a .multisampleResolve store action. Note that msaaSampleCount defaults to 1 here, which keeps the original single-sample path; set it to 4 or another supported value to actually turn multisampling on.

--- a/HEVC-Videos-With-Alpha-AssetWriting/HEVC-Videos-With-Alpha-AssetWriting/AppDelegate.swift
+++ b/HEVC-Videos-With-Alpha-AssetWriting/HEVC-Videos-With-Alpha-AssetWriting/AppDelegate.swift
@@ -32,6 +32,8 @@ class AppDelegate: NSObject, NSApplicationDelegate, SCNSceneRendererDelegate {
     let renderer = SCNRenderer(device: nil, options: nil)
     var lampMaterials: SCNNode!
     var metalTextureCache: CVMetalTextureCache!
+    let msaaSampleCount = 1
+    var metalMultisampledTexture: MTLTexture!
     
     // Export
     var frameCounter = 0
@@ -61,6 +63,18 @@ class AppDelegate: NSObject, NSApplicationDelegate, SCNSceneRendererDelegate {
             fatalError("Cannot create metal texture cache: \(err)")
         }
         metalTextureCache = optionalMetalTextureCache
+        
+        if (msaaSampleCount > 1) {
+            let textureDescriptor = MTLTextureDescriptor.texture2DDescriptor(pixelFormat: MTLPixelFormat.bgra8Unorm_srgb,
+                                                                             width: ExportSettings.width,
+                                                                             height: ExportSettings.height,
+                                                                             mipmapped: false)
+            textureDescriptor.usage = .renderTarget
+            textureDescriptor.storageMode = .private
+            textureDescriptor.textureType = .type2DMultisample
+            textureDescriptor.sampleCount = msaaSampleCount
+            metalMultisampledTexture = renderer.device!.makeTexture(descriptor: textureDescriptor)
+        }
     }
     
     /// Render next frame and call the frame completion handler
@@ -106,7 +120,14 @@ class AppDelegate: NSObject, NSApplicationDelegate, SCNSceneRendererDelegate {
         let renderPassDescriptor = MTLRenderPassDescriptor()
         renderPassDescriptor.colorAttachments[0].loadAction = .clear
         renderPassDescriptor.colorAttachments[0].clearColor = clearColor
-        renderPassDescriptor.colorAttachments[0].texture = CVMetalTextureGetTexture(metalTexture)
+        if (msaaSampleCount > 1) {
+            renderPassDescriptor.colorAttachments[0].texture = metalMultisampledTexture
+            renderPassDescriptor.colorAttachments[0].resolveTexture = CVMetalTextureGetTexture(metalTexture)
+            renderPassDescriptor.colorAttachments[0].storeAction = .multisampleResolve
+        }
+        else {
+            renderPassDescriptor.colorAttachments[0].texture = CVMetalTextureGetTexture(metalTexture)
+        }
         renderer.render(atTime: currentPresentationTime.seconds,
                         viewport: ExportSettings.viewport,
                         commandBuffer: commandBuffer,
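One caveat the patch doesn't cover: devices don't support every sample count, so it is worth clamping msaaSampleCount before creating the texture. A hypothetical helper, not part of the sample project; it takes the support check as a closure so the scan-down logic is separate from Metal (on a real device you would pass device.supportsTextureSampleCount):

```swift
// Hypothetical helper: walk down from a preferred MSAA sample count to the
// highest value the device reports as supported (1 means "no MSAA").
func bestSampleCount(preferred: Int = 8, isSupported: (Int) -> Bool) -> Int {
    var count = preferred
    while count > 1 && !isSupported(count) {
        count -= 1
    }
    return max(count, 1)
}
```

Usage with a real device would look like `let msaaSampleCount = bestSampleCount(preferred: 4) { device.supportsTextureSampleCount($0) }`, keeping the rest of the patch unchanged.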
