What is the real Focal Length of the camera used in RealityKit?


    I am doing this Augmented Reality project starting from Xcode's default AR project.

    I need to know the focal length of the camera used by ARKit.

    This page defines Focal Length well:

    Focal length, usually represented in millimeters (mm), is the basic description of a photographic lens. It is not a measurement of the actual length of a lens, but a calculation of an optical distance from the point where light rays converge to form a sharp image of an object to the digital sensor or 35mm film at the focal plane in the camera. The focal length of a lens is determined when the lens is focused at infinity.

    That said, Apple offers a camera matrix called intrinsics. According to Apple, it is defined as follows:

    The values fx and fy are the pixel focal length, and are identical for square pixels. The values ox and oy are the offsets of the principal point from the top-left corner of the image frame. All values are expressed in pixels.
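
    For reference, those four parameters occupy the standard pinhole intrinsic-matrix layout (simd_float3x3 is column-major, so fx sits in column 0, fy in column 1, and ox/oy in column 2, which matches the subscripts used in the code further down):

        | fx   0   ox |
        |  0  fy   oy |
        |  0   0    1 |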

    I am getting the same number for fx and fy, that is 1515.481.

    To obtain the real focal length in millimeters,

    1. This page says I need to use this formula: F(mm) = F(pixels) * SensorWidth(mm) / ImageWidth(pixels), but I don't have the sensor dimensions.
    2. This other page says FC = fx/sx = fy/sy, where sx and sy are the image width and height, which I suppose will give me two different numbers (because fx = fy while the width and height differ)... and this is back to square one.

    On iPhone 11, ARCamera captures a frame with the following dimensions: 1920x1440, at least this number is reported by the property camera.imageResolution.
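
    As a sketch, the first formula is trivial to apply once a sensor width is known; the catch is that ARKit doesn't report one, so the value below is a made-up placeholder:

    // A sketch of F(mm) = F(pixels) * SensorWidth(mm) / ImageWidth(pixels).
    let fxPixels: Float = 1515.481       // fx from camera.intrinsics
    let imageWidthPixels: Float = 1920   // from camera.imageResolution
    let sensorWidthMM: Float = 5.6       // ASSUMPTION: placeholder, not provided by ARKit
    let focalLengthMM = fxPixels * sensorWidthMM / imageWidthPixels
    print(focalLengthMM)                 // ≈ 4.42 mm under this made-up assumption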

    In the name of mental sanity, is there a way to get the focal length of ARCamera used by RealityKit?

    Solution

    ARKit and RealityKit definitely have identical focal length values, because these two frameworks are supposed to work together. And although there's no focal length instance property for ARView at the moment, you can easily print the focal length of an ARSCNView or SCNView to the console:

    @IBOutlet var sceneView: ARSCNView!
    
    // Focal length (in mm) of the SceneKit camera that ARKit drives
    sceneView.pointOfView?.camera?.focalLength
    

    However, take into account that the ARKit, RealityKit and SceneKit frameworks don't use the screen resolution; they use a viewport size instead. The magnification factor for iPhone viewports is usually 1/2 or 1/3.
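
    A quick way to see the capture resolution next to the viewport size (a sketch, meant to run inside a view controller that owns the sceneView outlet):

    // Capture resolution vs. on-screen viewport size
    print(sceneView.session.currentFrame?.camera.imageResolution ?? .zero)  // e.g. (1920.0, 1440.0) pixels
    print(sceneView.bounds.size)    // viewport size in points
    print(UIScreen.main.scale)      // point-to-pixel scale factor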


    Intrinsic Camera Matrix

    As you said, in ARKit there's a 3x3 camera matrix that lets you convert between the 2D camera plane and the 3D world coordinate space.

    var intrinsics: simd_float3x3 { get }
    

    Using this matrix you can print 4 important parameters: fx, fy, ox and oy. Let's print them all:

    DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
                        
        print(" Focal Length: \(self.sceneView.pointOfView?.camera?.focalLength)")
        print("Sensor Height: \(self.sceneView.pointOfView?.camera?.sensorHeight)")
        // SENSOR HEIGHT IN mm
                        
        let frame = self.sceneView.session.currentFrame
    
        // INTRINSICS MATRIX
        print("Intrinsics fx: \(frame?.camera.intrinsics.columns.0.x)")
        print("Intrinsics fy: \(frame?.camera.intrinsics.columns.1.y)")
        print("Intrinsics ox: \(frame?.camera.intrinsics.columns.2.x)")
        print("Intrinsics oy: \(frame?.camera.intrinsics.columns.2.y)")
    }
    

    For iPhone X, the following values were printed (the screenshot of the console output is not reproduced here; the notable value is a focal length of about 20.78 mm).

    When you apply the formulas above, you'll get an implausible result (read on to find out why).


    About Wide-Angle Lens and OIS

    The iPhone X has two image sensors, and both camera modules are equipped with an optical image stabilizer (OIS). The wide-angle lens offers a 28-millimeter focal length and an aperture of f/1.8, while the telephoto lens is 56 millimeters and f/2.4.

    ARKit and RealityKit use the wide-angle rear camera module. In the iPhone X's case it's the 28-mm lens. But what about the printed value of focal length = 20.78 mm, huh? I believe that the discrepancy between the 28 mm and 20.78 mm values is due to the fact that video stabilization eats up about 25% of the total image area. This is done in order to eventually arrive at a focal length value of 28 mm for the final image.
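
    A rough sanity check of that idea (my own reading, treating the ~25% as a crop of the frame's linear extent rather than strictly of its area):

    // If stabilization crops ~25% of the frame's linear extent,
    // the effective (35mm-equivalent) focal length scales by 1/0.75.
    let capturedFocalLength = 20.78
    let effectiveFocalLength = capturedFocalLength / 0.75   // ≈ 27.7 mm, close to the nominal 28 mm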

    The red frame (in the illustration, omitted here) marks the cropping margin at the stabilization stage.


    Conclusion

    This is my own conclusion. I didn't find any reference materials on the subject, so don't judge me too harshly if my opinion is wrong (I admit it may be).

    We all know that camera shake is magnified as focal length increases. So the lower the focal length value, the less the camera shake. This is very important for jitter-free, high-quality world tracking in an AR app. Also, I firmly believe that optical image stabilizers work much better at lower focal lengths. Hence, it's no surprise that ARKit engineers chose a lower focal length for the AR experience (capturing a wider image area); then, after stabilization, we get a modified version of the image, as if it had focal length = 28 mm.

    So, in my humble opinion, it makes no sense to calculate a REAL focal length for RealityKit and ARKit 'cause there is a "FAKE" focal length already implemented by Apple engineers for a robust AR experience.
