How to apply CIFilter to UIView?
Question
According to the Apple docs, the filters property of CALayer is not supported in iOS. Yet I have used apps that do apply a CIFilter to a UIView, e.g. Splice, Video Editor Videoshow FX, Funimate, and Artisto. That means it must be possible to apply a CIFilter to a UIView.
I have used the SCRecorder library and tried to get this task done with SCPlayer and SCFilterImageView, but I am facing a black-screen issue when the video plays after applying the CIFilter. So kindly help me complete this task, so that I can apply a CIFilter to a UIView and also change the filter by tapping a UIButton.
Answer
The technically accurate answer is that a CIFilter requires a CIImage. You can turn a UIView into a UIImage and then convert that into a CIImage, but all Core Image filters that take an image as input (there are some that generate a new image instead) use a CIImage for both input and output.
- Please note that the origin for a CIImage is the bottom left, not the top left. Basically, the Y axis is flipped.
- If you use Core Image filters dynamically, learn to render into a GLKView - it uses the GPU, whereas a UIImageView uses the CPU.
- If you want to test out a filter, it's best to use an actual device. The simulator will give you very poor performance - I've seen a simple blur take nearly a minute in the simulator, where on a device it takes a fraction of a second!
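As a small illustration of the flipped Y axis (this helper and its name are mine, not part of the original answer): a point measured from the top in UIKit coordinates sits at height minus y in Core Image coordinates.

```swift
import Foundation

// Illustration only: convert a y coordinate from UIKit's top-left origin
// to Core Image's bottom-left origin, within an image of height h.
func flippedY(_ y: CGFloat, imageHeight h: CGFloat) -> CGFloat {
    return h - y
}

// A point 10 pt from the top of a 100 pt tall image in UIKit coordinates
// is 90 pt from the bottom in Core Image coordinates.
let ciY = flippedY(10, imageHeight: 100)  // 90
```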
Let's say you have a UIView
that you wish to apply a CIPhotoEffectMono to. The steps to do this would be:
- Convert the UIView into a CIImage.
- Apply the filter, getting a CIImage as output.
- Use a CIContext to create a CGImage, and then convert that to a UIImage.
Here's a UIView extension that will convert the view and all its subviews into a UIImage:
extension UIView {
    public func createImage() -> UIImage {
        // Render the view's layer (which includes all subviews) into an image context
        UIGraphicsBeginImageContextWithOptions(
            CGSize(width: self.frame.width, height: self.frame.height), true, 1)
        self.layer.render(in: UIGraphicsGetCurrentContext()!)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image!
    }
}
Converting a UIImage into a CIImage is one line of code:

let ciInput = CIImage(image: myView.createImage())
Here's a function that will apply the filter and return a UIImage:
func convertImageToBW(image: UIImage) -> UIImage {
    let filter = CIFilter(name: "CIPhotoEffectMono")

    // convert UIImage to CIImage and set as input
    let ciInput = CIImage(image: image)
    filter?.setValue(ciInput, forKey: "inputImage")

    // get output CIImage, render as CGImage first to retain proper UIImage scale
    let ciOutput = filter?.outputImage
    let ciContext = CIContext()
    let cgImage = ciContext.createCGImage(ciOutput!, from: ciOutput!.extent)
    return UIImage(cgImage: cgImage!)
}
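To cover the UIButton part of the question, here is a hedged sketch that ties the pieces together; the outlet names (sourceView, filteredImageView) and the filter list are my own assumptions, not from the original answer. Tapping the button snapshots the view, applies the current filter, shows the result in a UIImageView, and cycles to the next filter name.

```swift
import UIKit
import CoreImage

// Sketch only: sourceView, filteredImageView, and filterNames are
// hypothetical names introduced for this example.
class FilterViewController: UIViewController {
    @IBOutlet var sourceView: UIView!
    @IBOutlet var filteredImageView: UIImageView!

    // Any Core Image filter that accepts an inputImage will work here.
    let filterNames = ["CIPhotoEffectMono", "CIPhotoEffectNoir", "CISepiaTone"]
    var filterIndex = 0

    @IBAction func changeFilterTapped(_ sender: UIButton) {
        // Snapshot the view using the createImage() extension above.
        let snapshot = sourceView.createImage()
        guard let ciInput = CIImage(image: snapshot),
              let filter = CIFilter(name: filterNames[filterIndex]) else { return }
        filter.setValue(ciInput, forKey: kCIInputImageKey)

        // Render through a CIContext, as in convertImageToBW above.
        guard let ciOutput = filter.outputImage,
              let cgImage = CIContext().createCGImage(ciOutput, from: ciOutput.extent)
        else { return }
        filteredImageView.image = UIImage(cgImage: cgImage)

        // Cycle to the next filter for the next tap.
        filterIndex = (filterIndex + 1) % filterNames.count
    }
}
```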