Rendering a video in a CALayer hierarchy using CIFilters

Question

In the UI of my iOS app, I display a complex hierarchy of CALayers. One of these layers is an AVPlayerLayer that displays a video with CIFilters applied in real time (using AVVideoComposition(asset:, applyingCIFiltersWithHandler:)).
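
For context, a minimal sketch of that playback setup might look like the following (asset and filter are placeholder names, not code from the actual project):

import AVFoundation
import CoreImage

// Placeholder names: `asset` is the AVAsset being played, `filter` a configured CIFilter.
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
    // Feed the current frame into the filter and hand the result back to AVFoundation.
    filter.setValue(request.sourceImage, forKey: kCIInputImageKey)
    if let output = filter.outputImage {
        request.finish(with: output, context: nil)
    } else {
        request.finish(with: NSError(domain: "PlaybackFilter", code: 0, userInfo: nil))
    }
})

let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition
let player = AVPlayer(playerItem: playerItem)
let playerLayer = AVPlayerLayer(player: player)
// `playerLayer` is then inserted somewhere into the CALayer hierarchy.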

Now I want to export this layer composition to a video file. There are two tools in AVFoundation that seem helpful:

A: AVVideoCompositionCoreAnimationTool, which allows rendering a video inside a (possibly animated) CALayer hierarchy

B: AVVideoComposition(asset:, applyingCIFiltersWithHandler:), which I also use in the UI, to apply CIFilters to a video asset.

However, these two tools cannot be used simultaneously: If I start an AVAssetExportSession that combines these tools, AVFoundation throws an NSInvalidArgumentException:

Expecting video composition to contain only AVCoreImageFilterVideoCompositionInstruction

I tried to work around this limitation as follows:

Workaround 1

1) Use an AVAssetReader and an AVAssetWriter.

2) Obtain the sample buffers from the asset reader, apply the CIFilter, and save the result in a CGImage.

3) Set the CGImage as the contents of the video layer in the layer hierarchy. Now the layer hierarchy "looks like" one frame of the final video.

4) Obtain the data of the CVPixelBuffer for each frame from the asset writer using CVPixelBufferGetBaseAddress and create a CGContext with that data.

5) Render my layer to that context using CALayer.render(in ctx: CGContext).

This setup works, but is extremely slow - exporting a 5-second video sometimes takes a minute. It looks like the CoreGraphics calls are the bottleneck here (I guess that's because with this approach the composition happens on the CPU?).
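
A rough per-frame sketch of steps 2 to 5 is shown below. All names here (renderExportFrame, rootLayer, videoLayer, the adaptor mentioned in the comments) are placeholders for illustration, not code from the project:

import AVFoundation
import CoreImage
import QuartzCore

func renderExportFrame(sampleBuffer: CMSampleBuffer,
                       into pixelBuffer: CVPixelBuffer,
                       rootLayer: CALayer,
                       videoLayer: CALayer,
                       filter: CIFilter,
                       ciContext: CIContext) {
    // Step 2: apply the CIFilter to the decoded frame and keep the result as a CGImage.
    guard let sourcePixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    filter.setValue(CIImage(cvPixelBuffer: sourcePixelBuffer), forKey: kCIInputImageKey)
    guard let filtered = filter.outputImage,
          let cgImage = ciContext.createCGImage(filtered, from: filtered.extent) else { return }

    // Step 3: make the layer hierarchy look like this frame of the final video.
    videoLayer.contents = cgImage

    // Steps 4 and 5: draw the whole hierarchy into the writer's pixel buffer on the CPU.
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    if let base = CVPixelBufferGetBaseAddress(pixelBuffer),
       let cgContext = CGContext(data: base,
                                 width: CVPixelBufferGetWidth(pixelBuffer),
                                 height: CVPixelBufferGetHeight(pixelBuffer),
                                 bitsPerComponent: 8,
                                 bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                 space: CGColorSpaceCreateDeviceRGB(),
                                 bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
                                     | CGBitmapInfo.byteOrder32Little.rawValue) {
        rootLayer.render(in: cgContext)   // CPU-bound: this is the slow part
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
    // The buffer is then appended via an AVAssetWriterInputPixelBufferAdaptor.
}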

Workaround 2

One other approach could be to do this in two steps: first, save the source video to a file with just the filters applied, as in B, and then use that video file to embed the video in the layer composition, as in A. However, since it uses two passes, I guess this isn't as efficient as it could be.
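
A sketch of what pass 1 might look like (sourceAsset, filter and intermediateURL are placeholder names, not from the project):

import AVFoundation
import CoreImage

// Pass 1: bake the CIFilters into an intermediate file, as in B.
let filterComposition = AVVideoComposition(asset: sourceAsset, applyingCIFiltersWithHandler: { request in
    filter.setValue(request.sourceImage, forKey: kCIInputImageKey)
    request.finish(with: filter.outputImage ?? request.sourceImage, context: nil)
})

let firstPass = AVAssetExportSession(asset: sourceAsset, presetName: AVAssetExportPresetHighestQuality)!
firstPass.videoComposition = filterComposition
firstPass.outputURL = intermediateURL
firstPass.outputFileType = .mov
firstPass.exportAsynchronously {
    // Pass 2: load `intermediateURL` as a new AVAsset and export it again, this time
    // with an AVVideoCompositionCoreAnimationTool attached to the second export's
    // video composition, as in A.
}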

Summary

What is a good approach to export this video to a file, ideally in a single pass? How can I use CIFilters and AVVideoCompositionCoreAnimationTool simultaneously? Is there a native way to set up a "pipeline" in AVFoundation which combines these tools?

Answer

The way to achieve this is to use a custom AVVideoCompositing. This object lets you compose (in this case, apply the CIFilter to) each video frame.

Here's an example implementation that applies a CIPhotoEffectNoir effect to the whole video:

import AVFoundation
import CoreImage

class VideoFilterCompositor: NSObject, AVVideoCompositing {

    // Pixel format accepted from the source and produced for the render context.
    var sourcePixelBufferAttributes: [String : Any]? = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    var requiredPixelBufferAttributesForRenderContext: [String : Any] = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    private var renderContext: AVVideoCompositionRenderContext?

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        renderContext = newRenderContext
    }

    func cancelAllPendingVideoCompositionRequests() {
    }

    private let filter = CIFilter(name: "CIPhotoEffectNoir")!
    private let context = CIContext()

    func startRequest(_ asyncVideoCompositionRequest: AVAsynchronousVideoCompositionRequest) {
        // Grab the source frame for the first track in the request.
        guard let track = asyncVideoCompositionRequest.sourceTrackIDs.first?.int32Value,
              let frame = asyncVideoCompositionRequest.sourceFrame(byTrackID: track) else {
            asyncVideoCompositionRequest.finish(with: NSError(domain: "VideoFilterCompositor", code: 0, userInfo: nil))
            return
        }
        // Run the frame through the filter and render the result into a pixel buffer
        // obtained from the render context's pool.
        filter.setValue(CIImage(cvPixelBuffer: frame), forKey: kCIInputImageKey)
        if let outputImage = filter.outputImage, let outBuffer = renderContext?.newPixelBuffer() {
            context.render(outputImage, to: outBuffer)
            asyncVideoCompositionRequest.finish(withComposedVideoFrame: outBuffer)
        } else {
            asyncVideoCompositionRequest.finish(with: NSError(domain: "VideoFilterCompositor", code: 0, userInfo: nil))
        }
    }

}

If you need to have different filters at different times, you can use a custom AVVideoCompositionInstructionProtocol, which you can get from the AVAsynchronousVideoCompositionRequest.
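
As a rough sketch of that idea: the FilterInstruction type and its filterName property below are made up for illustration, not part of AVFoundation.

import AVFoundation

// Hypothetical instruction type carrying the filter to use for its time range.
class FilterInstruction: NSObject, AVVideoCompositionInstructionProtocol {
    let timeRange: CMTimeRange
    let enablePostProcessing = true
    let containsTweening = true
    let requiredSourceTrackIDs: [NSValue]? = nil
    let passthroughTrackID = kCMPersistentTrackID_Invalid
    let filterName: String

    init(timeRange: CMTimeRange, filterName: String) {
        self.timeRange = timeRange
        self.filterName = filterName
    }
}

// Inside startRequest(_:), the compositor could then pick its filter per instruction:
// if let instruction = asyncVideoCompositionRequest.videoCompositionInstruction as? FilterInstruction,
//    let timedFilter = CIFilter(name: instruction.filterName) {
//     // use `timedFilter` instead of the fixed CIPhotoEffectNoir filter
// }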

Next, you need to use this with your AVMutableVideoComposition, so:

let videoComposition = AVMutableVideoComposition()
videoComposition.customVideoCompositorClass = VideoFilterCompositor.self
//Add your animator tool as usual
//(here `v` is the video layer and `p` its parent layer from your CALayer hierarchy, as in A)
let animator = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: v, in: p)
videoComposition.animationTool = animator
//Finish setting up the composition

With this, you should be able to export the video using a regular AVAssetExportSession, setting its videoComposition property.
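
For completeness, a minimal sketch of that last step, continuing the snippet above (asset and outputURL are placeholders, and the renderSize/frameDuration values are assumptions to be matched to your video):

// The composition from above still needs the usual bookkeeping before export:
videoComposition.renderSize = CGSize(width: 1920, height: 1080)   // match your video
videoComposition.frameDuration = CMTime(value: 1, timescale: 30)
// videoComposition.instructions = [...] // instructions covering the asset's full duration

let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetHighestQuality)!
export.videoComposition = videoComposition
export.outputURL = outputURL
export.outputFileType = .mov
export.exportAsynchronously {
    // Check export.status and export.error here.
}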
