How to add a CIFilter to MTLTexture Using ARMatteGenerator?


Problem description

I am working off of Apple's sample project related to using the ARMatteGenerator to generate an MTLTexture that can be used as an occlusion matte in the people occlusion technology.

I would like to determine how I could run the generated matte through a CIFilter. In my code, I am "filtering" the matte like so:

func updateMatteTextures(commandBuffer: MTLCommandBuffer) {
    guard let currentFrame = session.currentFrame else {
        return
    }
    alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer)
    dilatedDepthTexture = matteGenerator.generateDilatedDepth(from: currentFrame, commandBuffer: commandBuffer)

    // Wrap the matte texture in a CIImage, run it through the filter,
    // and render the result back into the same texture
    guard let alphaTexture = alphaTexture,
          let targetImage = CIImage(mtlTexture: alphaTexture, options: nil) else {
        return
    }
    monoAlphaCIFilter?.setValue(targetImage, forKey: kCIInputImageKey)
    monoAlphaCIFilter?.setValue(CIColor.red, forKey: kCIInputColorKey)
    guard let filteredImage = monoAlphaCIFilter?.outputImage else {
        return
    }
    let drawingBounds = CGRect(origin: .zero, size: CGSize(width: alphaTexture.width, height: alphaTexture.height))
    context.render(filteredImage, to: alphaTexture, commandBuffer: commandBuffer, bounds: drawingBounds, colorSpace: CGColorSpaceCreateDeviceRGB())
}

When I go to composite the matte texture and backgrounds, there is no filtering effect applied to the matte. This is how the textures are being composited:

func compositeImagesWithEncoder(renderEncoder: MTLRenderCommandEncoder) {
    guard let textureY = capturedImageTextureY, let textureCbCr = capturedImageTextureCbCr else {
        return
    }

    // Push a debug group allowing us to identify render commands in the GPU Frame Capture tool
    renderEncoder.pushDebugGroup("CompositePass")

    // Set render command encoder state
    renderEncoder.setCullMode(.none)
    renderEncoder.setRenderPipelineState(compositePipelineState)
    renderEncoder.setDepthStencilState(compositeDepthState)

    // Setup plane vertex buffers
    renderEncoder.setVertexBuffer(imagePlaneVertexBuffer, offset: 0, index: 0)
    renderEncoder.setVertexBuffer(scenePlaneVertexBuffer, offset: 0, index: 1)

    // Setup textures for the composite fragment shader
    renderEncoder.setFragmentBuffer(sharedUniformBuffer, offset: sharedUniformBufferOffset, index: Int(kBufferIndexSharedUniforms.rawValue))
    renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureY), index: 0)
    renderEncoder.setFragmentTexture(CVMetalTextureGetTexture(textureCbCr), index: 1)
    renderEncoder.setFragmentTexture(sceneColorTexture, index: 2)
    renderEncoder.setFragmentTexture(sceneDepthTexture, index: 3)
    renderEncoder.setFragmentTexture(alphaTexture, index: 4)
    renderEncoder.setFragmentTexture(dilatedDepthTexture, index: 5)

    // Draw final quad to display
    renderEncoder.drawPrimitives(type: .triangleStrip, vertexStart: 0, vertexCount: 4)
    renderEncoder.popDebugGroup()
}

How could I apply the CIFilter to only the alphaTexture generated by the ARMatteGenerator?

Answer

I don't think you want to apply a CIFilter to the alphaTexture. I assume you're using Apple's Effecting People Occlusion in Custom Renderers sample code. If you watch this year's Bringing People into AR WWDC session, they talk about generating a segmentation matte using ARMatteGenerator, which is what is being done with alphaTexture = matteGenerator.generateMatte(from: currentFrame, commandBuffer: commandBuffer). alphaTexture is an MTLTexture that is essentially an alpha mask for where humans have been detected in the camera frame (i.e. completely opaque where a human is and completely transparent where a human is not).

Adding a filter to the alpha texture won't filter the final rendered image but will simply affect the mask that is used in the compositing. If you're trying to achieve the effect shown in the video linked in your previous question, I would recommend adjusting the Metal shader where the compositing occurs. In the session, they point out that they compare the dilatedDepth and the renderedDepth to see if they should draw virtual content or pixels from the camera:

fragment half4 customComposition(...) {
    half4 camera = cameraTexture.sample(s, in.uv);
    half4 rendered = renderedTexture.sample(s, in.uv);
    float renderedDepth = renderedDepthTexture.sample(s, in.uv);
    half4 scene = mix(rendered, camera, rendered.a);
    half matte = matteTexture.sample(s, in.uv);
    float dilatedDepth = dilatedDepthTexture.sample(s, in.uv);

    if (dilatedDepth < renderedDepth) { // People in front of rendered
        // mix together the virtual content and camera feed based on the alpha provided by the matte
        return mix(scene, camera, matte);
    } else {
        // People are not in front so just return the scene
        return scene;
    }
}
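For intuition, the branch above can be modeled on the CPU. The following C++ sketch (my own names; scalar floats standing in for half4 values) mirrors the per-pixel decision:

```cpp
// Scalar stand-in for Metal's mix(): t = 0 -> a, t = 1 -> b
float mixf(float a, float b, float t) { return a + (b - a) * t; }

// Per-pixel model of the shader logic above: if the dilated (person) depth is
// closer than the rendered (virtual content) depth, blend toward the camera
// feed wherever the matte is opaque; otherwise keep the rendered scene.
float compositePixel(float scene, float camera, float matte,
                     float dilatedDepth, float renderedDepth) {
    if (dilatedDepth < renderedDepth) {
        return mixf(scene, camera, matte);
    }
    return scene;
}
```

With matte = 1.0 (a person pixel) and the person in front, the result is the camera pixel; with matte = 0.0 it stays the scene pixel.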

Unfortunately, this is done slightly differently in the sample code, but it's still fairly easy to modify. Open up Shaders.metal and find the compositeImageFragmentShader function. Toward the end of the function you'll see half4 occluderResult = mix(sceneColor, cameraColor, alpha);. This is essentially the same operation as the mix(scene, camera, matte); that we saw above. We're deciding if we should use a pixel from the scene or a pixel from the camera feed based on the segmentation matte. We can easily replace the camera image pixel with an arbitrary rgba value by replacing cameraColor with a half4 that represents a color. For example, we could use half4(float4(0.0, 0.0, 1.0, 1.0)) to paint all of the pixels within the segmentation matte blue:

…
// Replacing camera color with blue
half4 occluderResult = mix(sceneColor, half4(float4(0.0, 0.0, 1.0, 1.0)), alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;
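Numerically, the two mix() calls in the sample shader behave like this small scalar C++ model (the function names are mine, not from the sample code):

```cpp
// Scalar stand-in for Metal's mix(): t = 0 -> a, t = 1 -> b
float mixf(float a, float b, float t) { return a + (b - a) * t; }

// Two-stage blend from compositeImageFragmentShader: first pick between the
// scene and the replacement color using the matte alpha, then gate the whole
// occluder effect with showOccluder (0 or 1).
float composite(float sceneColor, float replacementColor, float alpha, float showOccluder) {
    float occluderResult = mixf(sceneColor, replacementColor, alpha);
    return mixf(sceneColor, occluderResult, showOccluder);
}
```

Wherever alpha is 0 (no person) or showOccluder is 0, the scene pixel passes through unchanged; only inside the matte does the replacement color win, which is why swapping out the color term changes what the occluded region looks like.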

Of course, you can apply other effects as well. Dynamic grayscale static is pretty easy to achieve.

Above compositeImageFragmentShader, add:

float random(float offset, float2 tex_coord, float time) {
    // pick two numbers that are unlikely to repeat
    float2 non_repeating = float2(12.9898 * time, 78.233 * time);

    // multiply our texture coordinates by the non-repeating numbers, then add them together
    float sum = dot(tex_coord, non_repeating);

    // calculate the sine of our sum to get a range between -1 and 1
    float sine = sin(sum);

    // multiply the sine by a big, non-repeating number so that even a small change will result in a big color jump
    float huge_number = sine * 43758.5453 * offset;

    // get just the numbers after the decimal point
    float fraction = fract(huge_number);

    // send the result back to the caller
    return fraction;
}

(taken from @twostraws ShaderKit)
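For reference, the hash can be reproduced on the CPU; this C++ sketch (my own function name, with float2 flattened into two scalars) mirrors the arithmetic above:

```cpp
#include <cmath>

// CPU mirror of the shader's pseudo-random hash: fract(sin(dot(uv, k)) * big)
float shaderRandom(float offset, float u, float v, float time) {
    float sum = u * (12.9898f * time) + v * (78.233f * time);  // dot(tex_coord, non_repeating)
    float huge = std::sin(sum) * 43758.5453f * offset;
    return huge - std::floor(huge);  // fract(): keep only the digits after the decimal point
}
```

The result always lands in [0, 1) and is deterministic for a given coordinate and time, so feeding in a value that changes every frame (here, a camera color channel) is what produces the flickering static.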

Then modify compositeImageFragmentShader to:

…
float randFloat = random(1.0, cameraTexCoord, rgb[0]);

half4 occluderResult = mix(sceneColor, half4(float4(randFloat, randFloat, randFloat, 1.0)), alpha);
half4 mattingResult = mix(sceneColor, occluderResult, showOccluder);
return mattingResult;

You should end up with the person matte filled with animated grayscale static.

Finally, the debugger seems to have a hard time keeping up with the app. For me, when running attached to Xcode, the app would freeze shortly after launch, but it was typically smooth when running on its own.
