iOS 10 breaks custom CIFilter


Question

I have written a chromakey filter for making the backgrounds of MPEG movies transparent so that you can use a movie file for longer animations without the need for lengthy sequences of PNGs (as is commonly done for some types of iOS animations).

I am using AVPlayer, AVVideoComposition, and a custom CIFilter to render the video over a background image. The background image can be changed dynamically by the user interacting with the app.

This used to work just fine until iOS 10 came out and now it is broken.

What happens now is that the video plays, but no chroma keying is occurring and Xcode repeatedly spits out the following error:

    need a swizzler so that YCC420v can be written.

Here's an image of what the CIFilter should produce:

And instead this is what it produces (since iOS 10):

Here's the section of my code that creates the EAGLContext and applies the custom CIFilter:

    let myEAGLContext = EAGLContext.init(API: EAGLRenderingAPI.OpenGLES2)
    //let cicontext = CIContext.init(EAGLContext: myEAGLContext, options: [kCIContextWorkingColorSpace: NSNull()])
    let cicontext = CIContext.init(EAGLContext: myEAGLContext)

    let filter = ChromaKeyFilter()
    filter.activeColor = CIColor.init(red: 0.0, green: 1.0, blue: 0.0)
    filter.threshold = self.threshold

    //most of below comes from the "WWDC15 What's New In Core Image" slides
    let vidComp = AVVideoComposition(asset: videoAsset!,
                                     applyingCIFiltersWithHandler:
        {
            request in
            let input = request.sourceImage.imageByClampingToExtent()

            filter.inputImage = input

            let output = filter.outputImage!.imageByClampingToExtent()
            request.finishWithImage(output, context: cicontext)
            self.reloadInputViews()

    })

    let playerItem = AVPlayerItem(asset: videoAsset!)
    playerItem.videoComposition = vidComp
    self.player = AVPlayer(playerItem: playerItem)
    self.playerInitialized = true
    let layer = AVPlayerLayer(player: player)

    self.subviews.forEach { subview in
        subview.removeFromSuperview()
    }

    layer.frame = CGRect(x: 0.0, y: 0.0, width: self.frame.size.width, height: self.frame.size.height)
    self.layer.addSublayer(layer)

Here's the code for the custom CIFilter:

    private class ChromaKeyFilter : CIFilter {
        private var kernel: CIColorKernel!
        var inputImage: CIImage?
        var activeColor = CIColor(red: 0.0, green: 1.0, blue: 0.0)
        var threshold: Float = 0.05

        override init() {
            super.init()
            kernel = createKernel()
        }

        required init(coder aDecoder: NSCoder) {
            super.init(coder: aDecoder)!
            kernel = createKernel()
        }

        override var outputImage: CIImage? {
            if let inputImage = inputImage {
                let dod = inputImage.extent
                let args = [inputImage as AnyObject, activeColor as AnyObject, threshold as AnyObject]
                return kernel.applyWithExtent(dod, arguments: args)
            }
            return nil
        }

        private func createKernel() -> CIColorKernel {
            //the kernel below was adapted from the GPUImage custom chroma key filter:
            //https://github.com/BradLarson/GPUImage/blob/master/framework/Source/GPUImageChromaKeyFilter.m#L30
            let kernelString =
                "kernel vec4 chromaKey( __sample s, __color c, float threshold ) { \n" +
                "  float maskY = 0.2989 * c.r + 0.5866 * c.g + 0.1145 * c.b;\n" +
                "  float maskCr = 0.7132 * (c.r - maskY);\n" +
                "  float maskCb = 0.5647 * (c.b - maskY);\n" +
                "  float Y = 0.2989 * s.r + 0.5866 * s.g + 0.1145 * s.b;\n" +
                "  float Cr = 0.7132 * (s.r - Y);\n" +
                "  float Cb = 0.5647 * (s.b - Y);\n" +
                "  float blendValue = smoothstep(threshold, threshold + 0.5, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));\n" +
                "  return blendValue * vec4( s.rgb, 1.0 ); \n" +
                "}"
            return CIColorKernel(string: kernelString)!
        }
    }

Anybody have some ideas about why this is only now breaking? Interestingly, it is only broken on the phone; it still works in the simulator, albeit much slower than it did before iOS 10 came out.

Answer

It looks like some part (the player layer?) of the iOS 10 (device) pipeline has switched to YUV: '420v' is the FourCC for kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange.

Setting your AVPlayerLayer's pixelBufferAttributes to BGRA fixes the lack of alpha and silences the logged error:

    layer.pixelBufferAttributes = [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)]
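
Applied to the layer setup from the question, that amounts to one extra line before the layer is attached. A sketch, reusing the question's own variables:

    // Ask the player layer to vend BGRA pixel buffers so the filter's
    // alpha channel survives, instead of the iOS 10 default (YUV '420v').
    let layer = AVPlayerLayer(player: player)
    layer.pixelBufferAttributes = [kCVPixelBufferPixelFormatTypeKey as String: NSNumber(value: kCVPixelFormatType_32BGRA)]
    layer.frame = CGRect(x: 0.0, y: 0.0, width: self.frame.size.width, height: self.frame.size.height)
    self.layer.addSublayer(layer)

Note that forcing BGRA trades the efficiency of the native YUV path for a correct alpha channel, which is the right trade-off when the composition actually needs transparency.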
