Live camera is getting stretched while rendering using CIFilter - Swift 4


Question

I want to apply a camera filter while rendering; my code is:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    let image = CIImage(cvPixelBuffer: frame.capturedImage)
    var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0

    // Split the chosen UIColor into its channel components.
    color.getRed(&r, green: &g, blue: &b, alpha: &a)

    // filter is a CIColorMatrix filter (implied by the inputRVector/
    // inputGVector/inputBVector/inputAVector keys used below).
    filter.setDefaults()
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(CIVector(x: r, y: 0, z: 0, w: 0), forKey: "inputRVector")
    filter.setValue(CIVector(x: 0, y: g, z: 0, w: 0), forKey: "inputGVector")
    filter.setValue(CIVector(x: 0, y: 0, z: b, w: 0), forKey: "inputBVector")
    filter.setValue(CIVector(x: 0, y: 0, z: 0, w: a), forKey: "inputAVector")

    if let result = filter.outputImage,
        let cgImage = context.createCGImage(result, from: result.extent) {
        sceneView.scene.background.contents = cgImage
        sceneView.scene.background.contentsTransform = SCNMatrix4MakeRotation(.pi / 2, 0, 0, 1)
    }
}

At runtime, the output gets stretched. I have attached two images:

  1. Normal camera render
  2. Camera render with the filter applied.

Please help me resolve it; any demo code or project would be a great help. Thank you.

Answer

I would like to elaborate on Juan Boero's answer. While what he says is completely true, it took me some time to figure out the solution, so for people who come here later looking for a concrete approach to this problem, here is what I've got:

When you use the capturedImage of an ARFrame, you receive a CVPixelBuffer representing an image oriented in landscape (because that is how the iPhone's camera captures it). If you try to transform this image to a normal orientation using

// screenOrientation is assumed to hold the current UIInterfaceOrientation
let transform = frame.displayTransform(
    for: screenOrientation,
    viewportSize: sceneView.bounds.size
).inverted()
let image = CIImage(cvPixelBuffer: frame.capturedImage).transformed(by: transform)

You will get a stretched image, because (at least in my case) the pixel buffer's dimensions are 1920x1440 (width x height) while sceneView.bounds.size is 375x812 (width x height). So there is no way to fit the rotated 1440x1920 buffer into 375x812 without distortion, because their aspect ratios are not compatible.
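To see the mismatch numerically, here is a small sketch (using the sizes quoted above; plain arithmetic, not part of the original answer). Mapping one rectangle directly onto the other scales x and y by different factors, which is exactly the stretch seen on screen:

```swift
import Foundation

// The buffer rotated to portrait is 1440x1920; the view is 375x812.
let rotatedBuffer = CGSize(width: 1440, height: 1920)
let viewport = CGSize(width: 375, height: 812)

let bufferAspect = rotatedBuffer.width / rotatedBuffer.height   // 0.75
let viewAspect = viewport.width / viewport.height               // ≈ 0.462

// A direct fit scales the two axes unequally, hence the distortion.
let scaleX = viewport.width / rotatedBuffer.width
let scaleY = viewport.height / rotatedBuffer.height
```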

If you just need an image, what you can actually do is apply the transformation using the inverted dimensions of the pixel buffer as the viewport size:

// Swap width and height so the viewport matches the buffer's own
// (rotated) dimensions instead of the screen's.
let width = CVPixelBufferGetWidth(frame.capturedImage)
let height = CVPixelBufferGetHeight(frame.capturedImage)
let transform = frame.displayTransform(
    for: screenOrientation,
    viewportSize: CGSize(width: height, height: width)
).inverted()
let image = CIImage(cvPixelBuffer: frame.capturedImage).transformed(by: transform)

This way you will get a correctly rotated image with the correct dimensions.

Then you can do whatever you want with it, e.g. crop it so that it aspect-fits the scene view.
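The cropping step could be sketched as follows. This is a hypothetical helper (not from the original answer) that computes the largest centered sub-rectangle of the image matching the view's aspect ratio; the resulting rect can then be applied with CIImage's cropped(to:):

```swift
import Foundation

/// Returns the largest centered sub-rectangle of `source` that has the
/// same aspect ratio as `target` - i.e. the region to keep so the image
/// fills the view without stretching. (Hypothetical helper, not part of
/// the original answer.)
func aspectFillCropRect(source: CGSize, target: CGSize) -> CGRect {
    let sourceAspect = source.width / source.height
    let targetAspect = target.width / target.height
    if sourceAspect > targetAspect {
        // Source is wider than the target: trim the sides.
        let width = source.height * targetAspect
        return CGRect(x: (source.width - width) / 2, y: 0,
                      width: width, height: source.height)
    } else {
        // Source is taller than the target: trim top and bottom.
        let height = source.width / targetAspect
        return CGRect(x: 0, y: (source.height - height) / 2,
                      width: source.width, height: height)
    }
}

// Portrait-rotated buffer (1440x1920) cropped for a 375x812 view;
// the result would be passed to image.cropped(to: crop).
let crop = aspectFillCropRect(source: CGSize(width: 1440, height: 1920),
                              target: CGSize(width: 375, height: 812))
```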

