Using CIFilter with AVFoundation (iOS)


Question

I am trying to apply filters to a video composition created with AVFoundation on iOS (filters could be, e.g., blur, pixelate, sepia, etc.). I need to both apply the effects in real time and be able to render the composite video out to disk, but I'm happy to start with just one or the other.



Unfortunately, I can't seem to figure this one out. Here's what I can do:

  • I can add a layer for animation to the UIView that's playing the movie, but it's not clear to me whether I can process the incoming video image this way.

  • I can add an array of CIFilters to the AVPlayerLayer, but it turns out these are ignored on iOS (they only work on Mac OS X).

  • I can add an AVVideoCompositionCoreAnimationTool to the AVVideoComposition, but I'm not sure this would accomplish video processing (rather than animation), and it crashes with a message about not being designed for real-time playback anyway. I believe this is the solution for rendering animation when rendering to disk.



Other apps do this (I think), so I assume I'm missing something obvious.



Note: I've looked into GPUImage and I'd love to use it, but it just doesn't work well with movies, especially movies with audio. See for example:




Solution

You can use the AVVideoCompositing protocol together with the AVAsynchronousVideoCompositionRequest class to implement a custom compositor.

// `request` is the AVAsynchronousVideoCompositionRequest passed to the compositor
CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID];
CIImage *theImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIFilter *motionBlur = [CIFilter filterWithName:@"CIMotionBlur"
                                  keysAndValues:kCIInputImageKey, theImage, nil];
CIImage *motionBlurredImage = [motionBlur valueForKey:kCIOutputImageKey];
CIContext *someCIContext = [CIContext contextWithEAGLContext:eaglContext];
[someCIContext render:motionBlurredImage toCVPixelBuffer:outputBuffer];
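
For context, a snippet like the one above runs inside a compositor object that adopts AVVideoCompositing. A minimal sketch of such a class might look like the following (the class name MyFilterCompositor is illustrative, not from the original answer; the filtering itself goes where the comment indicates):

@interface MyFilterCompositor : NSObject <AVVideoCompositing>
@end

@implementation MyFilterCompositor

// Pixel format the compositor wants for source frames
- (NSDictionary *)sourcePixelBufferAttributes {
    return @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
}

// Pixel format the compositor produces for output frames
- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{(NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
    // Store the context here if you need its pixel buffer pool or size.
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    // Get an output buffer from the render context's pool
    CVPixelBufferRef outputBuffer = [request.renderContext newPixelBuffer];
    // ... fetch the source frame, run it through CIFilter, and render
    //     the result into outputBuffer, as shown above ...
    [request finishWithComposedVideoFrame:outputBuffer];
    CVBufferRelease(outputBuffer);
}

@end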

Then render the pixel buffer using OpenGL as described in Apple's documentation. This will allow you to implement any number of transitions or filters that you want. You can then set AVAssetExportSession.videoComposition and export the composited video to disk.
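
As a rough sketch of that export step (assuming a hypothetical compositor class named MyFilterCompositor that conforms to AVVideoCompositing, and an `asset` and `outputURL` defined elsewhere), wiring the custom compositor into an export session might look like:

// Build a video composition from the asset and attach the custom compositor
AVMutableVideoComposition *composition =
    [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
composition.customVideoCompositorClass = [MyFilterCompositor class];

// Export the composited video to disk
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = composition;
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    // Check exporter.status (and exporter.error on failure) here
}];

The same videoComposition can also be assigned to an AVPlayerItem for real-time playback with the filter applied.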


