How to apply a Vignette CIFilter to a live camera feed in iOS?

Question
While trying to apply a simple vignette filter to the raw camera feed of an iPhone 6, with the help of Metal and Core Image, I see a lot of lag between the frames being processed and rendered in an MTKView.

The approach I followed is (MetalViewController.swift):
- Get the raw camera output using AVCaptureVideoDataOutputSampleBufferDelegate
- Convert CMSampleBuffer > CVPixelBuffer > CGImage
- Create an MTLTexture from this CGImage.

Steps 2 and 3 are inside a method named fillMTLTextureToStoreTheImageData.
- Apply a CIFilter to the CIImage fetched from the MTLTexture, in the MTKViewDelegate:
func draw(in view: MTKView) {
    if let currentDrawable = view.currentDrawable {
        let commandBuffer = self.commandQueue.makeCommandBuffer()
        if let myTexture = self.sourceTexture {
            // Wrap the Metal texture in a CIImage and feed it to the vignette filter
            let inputImage = CIImage(mtlTexture: myTexture, options: nil)
            self.vignetteEffect.setValue(inputImage, forKey: kCIInputImageKey)
            // Render the filtered image directly into the drawable's texture
            self.coreImageContext.render(self.vignetteEffect.outputImage!,
                                         to: currentDrawable.texture,
                                         commandBuffer: commandBuffer,
                                         bounds: inputImage!.extent,
                                         colorSpace: self.colorSpace)
            commandBuffer?.present(currentDrawable)
            commandBuffer?.commit()
        }
    }
}
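For reference, steps 2 and 3 above (the CMSampleBuffer > CVPixelBuffer > CGImage > MTLTexture conversion) could look roughly like the sketch below. The actual body of fillMTLTextureToStoreTheImageData is not shown in the question, so the signature and details here are assumptions:

import AVFoundation
import CoreImage
import MetalKit

// Hypothetical sketch of fillMTLTextureToStoreTheImageData; names and
// details are assumptions, since the original method body is not shown.
func fillMTLTextureToStoreTheImageData(from sampleBuffer: CMSampleBuffer,
                                       device: MTLDevice) -> MTLTexture? {
    // Step 2: CMSampleBuffer > CVPixelBuffer > CGImage
    // (this round trip through a CGImage is the expensive, CPU-bound part)
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }

    // Step 3: CGImage > MTLTexture
    let loader = MTKTextureLoader(device: device)
    return try? loader.newTexture(cgImage: cgImage, options: nil)
}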
The performance is not at all what Apple mentioned in this doc: https://developer.apple.com/library/archive/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_tasks/ci_tasks.html#//apple_ref/doc/uid/TP30001185-CH3-TPXREF101 Am I missing something?

Solution

Your step 2 is way too slow to support real-time rendering... and it looks like you're missing a couple of steps. For your purpose, you would typically:
Setup:

- Create a pool of CVPixelBuffers, using CVPixelBufferPoolCreate
- Create a pool of Metal textures, using CVMetalTextureCacheCreate
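A minimal sketch of this one-time setup might look like the following; the concrete attribute values (size, pixel format) are assumptions and should match your capture session:

import CoreVideo
import Metal

// One-time setup: a pool of Metal-compatible pixel buffers and a texture cache.
// The width/height/format values here are placeholder assumptions.
var pixelBufferPool: CVPixelBufferPool?
let bufferAttributes: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
    kCVPixelBufferWidthKey as String: 1280,
    kCVPixelBufferHeightKey as String: 720,
    kCVPixelBufferMetalCompatibilityKey as String: true
]
CVPixelBufferPoolCreate(kCFAllocatorDefault,
                        nil,                                // pool attributes
                        bufferAttributes as CFDictionary,   // pixel buffer attributes
                        &pixelBufferPool)

var textureCache: CVMetalTextureCache?
let device = MTLCreateSystemDefaultDevice()!
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)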
For each frame:

- Convert CMSampleBuffer > CVPixelBuffer > CIImage
- Pass that CIImage through your filter pipeline
- Render the output image into a CVPixelBuffer from the pool created in step 1
- Use CVMetalTextureCacheCreateTextureFromImage to create a Metal texture with your filtered CVPixelBuffer
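The per-frame steps above can be sketched like this. Function and parameter names are assumptions for illustration; the pool, texture cache, filter, and CIContext are the objects created during setup:

import AVFoundation
import CoreImage
import CoreVideo
import Metal

// Hedged sketch of the per-frame path; everything stays on the GPU,
// with no intermediate CGImage. Names are illustrative assumptions.
func filteredTexture(for sampleBuffer: CMSampleBuffer,
                     pool: CVPixelBufferPool,
                     textureCache: CVMetalTextureCache,
                     filter: CIFilter,
                     ciContext: CIContext) -> MTLTexture? {
    // CMSampleBuffer > CVPixelBuffer > CIImage
    guard let sourceBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    filter.setValue(CIImage(cvPixelBuffer: sourceBuffer), forKey: kCIInputImageKey)
    guard let output = filter.outputImage else { return nil }

    // Render the filtered image into a CVPixelBuffer taken from the pool
    var outputBuffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &outputBuffer)
    guard let destination = outputBuffer else { return nil }
    ciContext.render(output, to: destination)

    // Wrap the filtered CVPixelBuffer in a Metal texture via the cache
    var cvTexture: CVMetalTexture?
    CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
                                              destination, nil, .bgra8Unorm,
                                              CVPixelBufferGetWidth(destination),
                                              CVPixelBufferGetHeight(destination),
                                              0, &cvTexture)
    return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
}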
If set up correctly, all these steps will make sure your image data stays on the GPU, as opposed to travelling from the GPU to the CPU and back to the GPU for display.
The good news is that all of this is demoed in the AVCamPhotoFilter sample code from Apple: https://developer.apple.com/library/archive/samplecode/AVCamPhotoFilter/Introduction/Intro.html#//apple_ref/doc/uid/TP40017556. In particular, see the RosyCIRenderer class and its superclass, FilterRenderer.