Applying a CIFilter to a Video File and Saving it
Question
Is there any fast, lightweight-as-possible way to apply a CIFilter
to a video? Before it's mentioned, I have looked at GPUImage - it looks like very powerful magic code, but it's really overkill for what I'm trying to do.
- Take a video file, say stored at /tmp/myVideoFile.mp4
- Apply a CIFilter to this video file
- Save the video file to a different (or the same) location, say /tmp/anotherVideoFile.mp4
I've been able to apply a CIFilter to a video that's playing extremely easily and quickly using AVPlayerItemVideoOutput
let player = AVPlayer(playerItem: AVPlayerItem(asset: video))
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
player.currentItem?.addOutput(output)
player.play()
let displayLink = CADisplayLink(target: self, selector: #selector(self.displayLinkDidRefresh(_:)))
displayLink.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSRunLoopCommonModes)
func displayLinkDidRefresh(link: CADisplayLink){
    let itemTime = output.itemTimeForHostTime(CACurrentMediaTime())
    if output.hasNewPixelBufferForItemTime(itemTime){
        if let pixelBuffer = output.copyPixelBufferForItemTime(itemTime, itemTimeForDisplay: nil){
            let image = CIImage(CVPixelBuffer: pixelBuffer)
            // apply filters to image
            // display image
        }
    }
}
This works great, but I've been having just the tiniest bit of trouble finding out how to apply a filter to an already-saved video file. There is the option of basically just doing what I did above, using an AVPlayer, playing the video, and getting the pixel buffer from every frame as it is played, but this won't work for video processing in the background. I don't think users would appreciate having to wait as long as their video's duration for the filter to be applied.
In way over-simplified code, I'm looking for something like this:
var newVideo = AVMutableAsset() // We'll just pretend like this is a thing
var originalVideo = AVAsset(url: NSURL(urlString: "/example/location.mp4"))

originalVideo.getAllFrames(){(pixelBuffer: CVPixelBuffer) -> Void in
    let image = CIImage(CVPixelBuffer: pixelBuffer)
        .imageByApplyingFilter("Filter", withInputParameters: [:])
    newVideo.addFrame(image)
}

newVideo.exportTo(url: NSURL(urlString: "/this/isAnother/example.mp4"))
Is there any fast (again, not involving GPUImage, and ideally working in iOS 7) way to apply a filter to a video file and then save it? For example, this would take a saved video, load it into an AVAsset, apply a CIFilter, and then save the new video to a different location.
Answer
In iOS 9 / OS X 10.11 / tvOS, there's a convenience method for applying CIFilters to video. It works on an AVVideoComposition, so you can use it both for playback and for file-to-file import/export. See AVVideoComposition.init(asset:applyingCIFiltersWithHandler:) for the method docs.
There's an example in Apple's Core Image Programming Guide, too:
let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in

    // Clamp to avoid blurring transparent pixels at the image edges
    let source = request.sourceImage.clampingToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)

    // Vary filter parameters based on video timing
    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

    // Crop the blurred output to the bounds of the original image
    let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

    // Provide the filter output to the composition
    request.finish(with: output, context: nil)
})
That part sets up the composition. After you've done that, you can either play it by assigning it to an AVPlayer or write it to a file with AVAssetExportSession. Since you're after the latter, here's an example of that:
let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPreset1920x1080)
export.outputFileType = AVFileTypeQuickTimeMovie
export.outputURL = outURL
export.videoComposition = composition
export.exportAsynchronouslyWithCompletionHandler(/*...*/)
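The completion handler elided above would typically check the session's status before using the output file; a minimal sketch, assuming the export and outURL values from the snippet above:

```swift
export.exportAsynchronouslyWithCompletionHandler {
    switch export.status {
    case .Completed:
        // The filtered movie is now at outURL
        print("Export finished: \(outURL)")
    case .Failed:
        // error describes what went wrong (disk space, interrupted session, etc.)
        print("Export failed: \(export.error)")
    default:
        break
    }
}
```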
There's also a bit more about this in a session video, starting around 20 minutes in.
If you want a solution that works on earlier OS, it's a bit more complicated.
Aside: Think about how far back you really need to support. As of August 15, 2016, 87% of devices are on iOS 9.0 or later, and 97% are on iOS 8.0 or later. Going to a lot of effort to support a small slice of your potential customer base—and it'll get even smaller by the time you get your project done and ready to deploy—might not be worth the cost.
There are a couple of ways to go at this. Either way, you'll be getting CVPixelBuffers representing source frames, creating CIImages from them, applying filters, and rendering out new CVPixelBuffers.
- Use AVAssetReader and AVAssetWriter to read and write pixel buffers. There are examples of how to do this (the reading and writing part; you still need to do the filtering in between) in the Export chapter of Apple's AVFoundation Programming Guide.
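A rough sketch of what that reader/writer loop might look like, with the Core Image filtering step in between. This is an illustrative assumption, not code from the guide: the function name and settings are made up, error handling is minimal, and a production version would use requestMediaDataWhenReadyOnQueue and render into buffers from the adaptor's pixel buffer pool rather than busy-waiting and reusing the reader's buffers:

```swift
import AVFoundation
import CoreImage

// Hypothetical helper: read frames from an asset, filter each, write a new movie file.
func filterVideo(asset: AVAsset, to outURL: NSURL, applying filter: CIFilter) throws {
    let track = asset.tracksWithMediaType(AVMediaTypeVideo)[0]

    // Reader vends decoded BGRA pixel buffers
    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
    reader.addOutput(readerOutput)

    // Writer re-encodes the filtered frames as H.264
    let writer = try AVAssetWriter(URL: outURL, fileType: AVFileTypeQuickTimeMovie)
    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: Int(track.naturalSize.width),
        AVVideoHeightKey: Int(track.naturalSize.height)
    ])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
        sourcePixelBufferAttributes: nil)
    writer.addInput(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSessionAtSourceTime(kCMTimeZero)

    let context = CIContext()
    while let sample = readerOutput.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
        let time = CMSampleBufferGetPresentationTimeStamp(sample)

        // Filter the frame with Core Image (rendering back into the same
        // buffer for brevity; see the caveat in the text above)
        filter.setValue(CIImage(CVPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        context.render(filter.outputImage!, toCVPixelBuffer: pixelBuffer)

        while !writerInput.readyForMoreMediaData { /* spin; use a queue in production */ }
        adaptor.appendPixelBuffer(pixelBuffer, withPresentationTime: time)
    }

    writerInput.markAsFinished()
    writer.finishWritingWithCompletionHandler { /* check writer.status */ }
}
```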
- Use AVVideoComposition with a custom compositor class. Your custom compositor is given AVAsynchronousVideoCompositionRequest objects that provide access to pixel buffers and a way for you to provide processed pixel buffers. Apple has a sample code project called AVCustomEdit that shows how to do this (again, just the getting and returning sample buffers part; you'd want to process with Core Image instead of using their GL renderers).
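The heart of such a compositor is one method that receives each frame request and finishes it with a filtered buffer. A minimal sketch, where the filter choice is illustrative and the protocol plumbing is abbreviated:

```swift
import AVFoundation
import CoreImage

// Minimal custom compositor: filters each source frame with Core Image.
class FilterCompositor: NSObject, AVVideoCompositing {
    let context = CIContext()
    let filter = CIFilter(name: "CISepiaTone")! // illustrative filter choice

    // Required pixel-format declarations for source and output buffers
    var sourcePixelBufferAttributes: [String: AnyObject]? =
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    var requiredPixelBufferAttributesForRenderContext: [String: AnyObject] =
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]

    func renderContextChanged(newRenderContext: AVVideoCompositionRenderContext) {}

    func startVideoCompositionRequest(request: AVAsynchronousVideoCompositionRequest) {
        guard let trackID = request.sourceTrackIDs.first?.intValue,
              let source = request.sourceFrameByTrackID(trackID),
              let output = request.renderContext.newPixelBuffer() else {
            request.finishCancelledRequest(); return
        }
        // Filter the source frame and render it into a fresh output buffer
        filter.setValue(CIImage(CVPixelBuffer: source), forKey: kCIInputImageKey)
        context.render(filter.outputImage!, toCVPixelBuffer: output)
        request.finishWithComposedVideoFrame(output)
    }

    func cancelAllPendingVideoCompositionRequests() {}
}
```

You'd then set this class as the customVideoCompositorClass of an AVMutableVideoComposition used for playback or export.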
Of those two, the AVVideoComposition route is more flexible, because you can use a composition both for playback and export.
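For the playback half of that flexibility, the same composition object just gets attached to the player item; a small sketch, assuming the asset and composition values from the answer above:

```swift
import AVFoundation

// Play the filtered composition instead of exporting it
let item = AVPlayerItem(asset: asset)
item.videoComposition = composition // frames are filtered on the fly during playback
let player = AVPlayer(playerItem: item)
player.play()
```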