How to apply a Vignette CIFilter to a live camera feed in iOS?


Problem description

While trying to apply a simple vignette filter to the raw camera feed of an iPhone 6 with the help of Metal and Core Image, I see a lot of lag between the frames being processed and rendered in the MTKView.

The approach I have followed is (MetalViewController.swift):

  1. Fetch the raw camera output using AVCaptureVideoDataOutputSampleBufferDelegate
  2. Convert CMSampleBuffer > CVPixelBuffer > CGImage
  3. Create an MTLTexture with this CGImage.

Points 2 and 3 are done inside a method named fillMTLTextureToStoreTheImageData; a sketch of what that conversion might look like is given below.
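The question does not show the body of fillMTLTextureToStoreTheImageData, so the following is only a hedged sketch of what a CMSampleBuffer > CVPixelBuffer > CGImage > MTLTexture conversion of that shape commonly looks like. The CIContext/MTKTextureLoader route and the device and sourceTexture properties are assumptions, not the asker's actual code:

    import AVFoundation
    import CoreImage
    import MetalKit

    // Hypothetical body for points 2 and 3; `device` and `sourceTexture`
    // are assumed to be properties of MetalViewController.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // CVPixelBuffer > CGImage: this step reads the frame back to the CPU.
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return }

        // CGImage > MTLTexture: uploads the bitmap back to the GPU.
        let loader = MTKTextureLoader(device: device)
        self.sourceTexture = try? loader.newTexture(cgImage: cgImage, options: nil)
    }

A per-frame CPU round trip of this kind is what the answer below identifies as the bottleneck.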

  4. Apply a CIFilter to the CIImage fetched from the MTLTexture in the MTKViewDelegate:

     func draw(in view: MTKView) {
        if let currentDrawable = view.currentDrawable {
            let commandBuffer = self.commandQueue.makeCommandBuffer()
            if let myTexture = self.sourceTexture {
                let inputImage = CIImage(mtlTexture: myTexture, options: nil)
                self.vignetteEffect.setValue(inputImage, forKey: kCIInputImageKey)
                self.coreImageContext.render(self.vignetteEffect.outputImage!,
                                             to: currentDrawable.texture,
                                             commandBuffer: commandBuffer,
                                             bounds: inputImage!.extent,
                                             colorSpace: self.colorSpace)
                commandBuffer?.present(currentDrawable)
                commandBuffer?.commit()
            }
        }
    }
 

The performance is not at all what Apple mentioned in this doc: https://developer.apple.com/library/archive/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_tasks/ci_tasks.html#//apple_ref/doc/uid/TP30001185-CH3-TPXREF101

Am I missing something?

Solution

Your step 2 is way too slow to support real-time rendering... and it looks like you're missing a couple of steps. For your purpose, you would typically:

Setup:

  1. Create a pool of CVPixelBuffers using CVPixelBufferPoolCreate
  2. Create a Metal texture cache using CVMetalTextureCacheCreate (a setup sketch follows this list)
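A minimal sketch of that one-time setup, assuming a BGRA capture format and a fixed 1920x1080 frame size (both are assumptions; match them to your actual video output):

    import CoreVideo
    import Metal

    let device = MTLCreateSystemDefaultDevice()!

    // 1. A pool of Metal-compatible, IOSurface-backed CVPixelBuffers.
    var pixelBufferPool: CVPixelBufferPool?
    let poolAttributes = [kCVPixelBufferPoolMinimumBufferCountKey as String: 3] as CFDictionary
    let bufferAttributes = [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: 1920,
        kCVPixelBufferHeightKey as String: 1080,
        kCVPixelBufferIOSurfacePropertiesKey as String: [:],
        kCVPixelBufferMetalCompatibilityKey as String: true
    ] as CFDictionary
    CVPixelBufferPoolCreate(kCFAllocatorDefault, poolAttributes, bufferAttributes, &pixelBufferPool)

    // 2. A texture cache that can wrap those buffers as MTLTextures without copying.
    var textureCache: CVMetalTextureCache?
    CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)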

For each frame:

  1. Convert CMSampleBuffer > CVPixelBuffer > CIImage
  2. Pass that CIImage through your filter pipeline
  3. Render the output image into a CVPixelBuffer drawn from the pool created during setup
  4. Use CVMetalTextureCacheCreateTextureFromImage to create a Metal texture from your filtered CVPixelBuffer (a per-frame sketch follows this list)
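A hedged sketch of that per-frame path; filter, ciContext, pixelBufferPool and textureCache are assumed to be the objects created in the setup above, and .bgra8Unorm matches the assumed BGRA buffers:

    import AVFoundation
    import CoreImage
    import Metal

    // Runs once per captured frame; returns a GPU-resident filtered texture.
    func filteredTexture(for sampleBuffer: CMSampleBuffer) -> MTLTexture? {
        guard let inputBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let pool = pixelBufferPool,
              let cache = textureCache else { return nil }

        // 1. CMSampleBuffer > CVPixelBuffer > CIImage (no CGImage, no CPU copy).
        let inputImage = CIImage(cvPixelBuffer: inputBuffer)

        // 2. Run the filter pipeline.
        filter.setValue(inputImage, forKey: kCIInputImageKey)
        guard let outputImage = filter.outputImage else { return nil }

        // 3. Render into a recycled CVPixelBuffer from the pool.
        var outputBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &outputBuffer)
        guard let destination = outputBuffer else { return nil }
        ciContext.render(outputImage, to: destination)

        // 4. Wrap the filtered buffer in a Metal texture via the cache.
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache, destination, nil,
                                                  .bgra8Unorm,
                                                  CVPixelBufferGetWidth(destination),
                                                  CVPixelBufferGetHeight(destination),
                                                  0, &cvTexture)
        return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
    }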

If set up correctly, all these steps will make sure your image data stays on the GPU, as opposed to travelling from the GPU to the CPU and back to the GPU for display.
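For the final display step, one possibility (an assumption on my part, not something the answer prescribes) is to blit the already-filtered texture into the drawable inside draw(in:), so no Core Image work happens there; this requires view.framebufferOnly = false and matching texture formats and sizes:

    // In the MTKViewDelegate; `currentFilteredTexture` and `commandQueue`
    // are assumed properties updated per frame.
    func draw(in view: MTKView) {
        guard let drawable = view.currentDrawable,
              let texture = currentFilteredTexture,
              let commandBuffer = commandQueue.makeCommandBuffer(),
              let blit = commandBuffer.makeBlitCommandEncoder() else { return }
        // Straight GPU-to-GPU copy of the filtered frame into the drawable.
        blit.copy(from: texture, sourceSlice: 0, sourceLevel: 0,
                  sourceOrigin: MTLOrigin(x: 0, y: 0, z: 0),
                  sourceSize: MTLSize(width: texture.width, height: texture.height, depth: 1),
                  to: drawable.texture, destinationSlice: 0, destinationLevel: 0,
                  destinationOrigin: MTLOrigin(x: 0, y: 0, z: 0))
        blit.endEncoding()
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }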

The good news is that all of this is demoed in the AVCamPhotoFilter sample code from Apple: https://developer.apple.com/library/archive/samplecode/AVCamPhotoFilter/Introduction/Intro.html#//apple_ref/doc/uid/TP40017556. In particular, see the RosyCIRenderer class and its superclass FilterRenderer.

