What is the real Focal Length of the camera used in RealityKit?
Question
I am doing this Augmented Reality project starting from Xcode's default AR project.
I need to know the focal length of the camera used by ARKit.
This page defines Focal Length well:
Focal length, usually represented in millimeters (mm), is the basic description of a photographic lens. It is not a measurement of the actual length of a lens, but a calculation of an optical distance from the point where light rays converge to form a sharp image of an object to the digital sensor or 35mm film at the focal plane in the camera. The focal length of a lens is determined when the lens is focused at infinity.
That said, Apple offers a camera matrix called intrinsics, defined as follows.
According to Apple:
The values fx and fy are the pixel focal length, and are identical for square pixels. The values ox and oy are the offsets of the principal point from the top-left corner of the image frame. All values are expressed in pixels.
I am getting the same number for fx and fy, namely 1515.481.
To obtain the real focal length in millimeters:

- This page says I need to use the formula F(mm) = F(pixels) * SensorWidth(mm) / ImageWidth(pixels), but I don't have the sensor dimensions.
- This other page says FC = fx/sx = fy/sy, where sx and sy are the image width and height. I suppose that will give me two different numbers, because fx = fy while sx ≠ sy... and this is back to square one.
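The first formula is easy to sketch in code. Note that the sensor width used below is a made-up placeholder, since Apple does not publish the exact sensor dimensions:

```swift
import Foundation

// Convert a pixel focal length from the intrinsics matrix to millimeters.
// sensorWidthMM is an ASSUMED value for illustration only; Apple does not
// document the physical sensor size.
func focalLengthInMM(fxPixels: Double, imageWidthPixels: Double, sensorWidthMM: Double) -> Double {
    // F(mm) = F(pixels) * SensorWidth(mm) / ImageWidth(pixels)
    return fxPixels * sensorWidthMM / imageWidthPixels
}

// Using the numbers from the question and a hypothetical 4.93 mm sensor width:
let focalMM = focalLengthInMM(fxPixels: 1515.481, imageWidthPixels: 1920.0, sensorWidthMM: 4.93)
```

Without a trustworthy sensor width, though, the result is only as good as the guess.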
On iPhone 11, ARCamera captures a frame with dimensions 1920x1440; at least that is the number reported by the property camera.imageResolution.
In the name of mental sanity, is there a way to get the focal length of the ARCamera used by RealityKit?
Answer
ARKit and RealityKit definitely have identical focal length parameter values; that's because these two frameworks are supposed to work together. And although there's no focal length instance property for ARView at the moment, you can easily print the focal length of an ARSCNView or SCNView in the console.
@IBOutlet var sceneView: ARSCNView!

// focalLength is in millimeters; available once the AR session is running
print(sceneView.pointOfView?.camera?.focalLength)
However, take into account that ARKit, RealityKit and SceneKit don't use the screen resolution; they use a viewport size instead. The magnification factor for iPhone viewports is usually 1/2 or 1/3.
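As a quick illustration of that factor (the numbers below are an assumption for iPhone X; on a real device you would read them from UIScreen's bounds and nativeBounds):

```swift
import Foundation

// iPhone X: native resolution 1125 px wide, viewport 375 pt wide.
// These numbers are hardcoded here for illustration; query UIScreen on a device.
let nativeWidthPx = 1125.0
let viewportWidthPt = 375.0
let magnification = viewportWidthPt / nativeWidthPx   // 1/3 on iPhone X
```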
As you said, in ARKit there's a 3x3 camera matrix allowing you to convert between the 2D camera plane and 3D world coordinate space.
var intrinsics: simd_float3x3 { get }
Using this matrix you can print 4 important parameters: fx, fy, ox and oy. Let's print them all:
DispatchQueue.main.asyncAfter(deadline: .now() + 2.0) {
    // Focal length and sensor height are reported in millimeters
    print(" Focal Length: \(self.sceneView.pointOfView?.camera?.focalLength)")
    print("Sensor Height: \(self.sceneView.pointOfView?.camera?.sensorHeight)")

    // Intrinsics matrix of the current ARFrame (all values in pixels)
    let frame = self.sceneView.session.currentFrame
    print("Intrinsics fx: \(frame?.camera.intrinsics.columns.0.x)")
    print("Intrinsics fy: \(frame?.camera.intrinsics.columns.1.y)")
    print("Intrinsics ox: \(frame?.camera.intrinsics.columns.2.x)")
    print("Intrinsics oy: \(frame?.camera.intrinsics.columns.2.y)")
}
For iPhone X, these calls print, among other values, a focal length of 20.78 mm.
When you apply your formulas you'll get an implausible result (read on to find out why).
The iPhone X has two image sensors, and both camera modules are equipped with an optical image stabilizer (OIS). The wide-angle lens offers a 28 mm focal length and an aperture of f/1.8, while the telephoto lens is 56 mm and f/2.4.
ARKit and RealityKit use the wide-angle rear camera module; in the iPhone X's case that is the 28 mm lens. But what about the printed value of focal length = 20.78 mm? I believe the discrepancy between 28 mm and 20.78 mm is due to the fact that video stabilization eats up about 25% of the total image area. This is done so that the final image ends up with an effective focal length of 28 mm.
The red frame is the cropping margin at the stabilization stage.
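A back-of-the-envelope check of this cropping hypothesis (my own arithmetic, not Apple documentation): cropping a frame by a linear factor k multiplies the effective focal length by k.

```swift
import Foundation

// Cropping a frame by linear factor k multiplies the effective focal length by k.
let capturedFocalMM = 20.78   // focal length reported by SceneKit on iPhone X
let nominalFocalMM = 28.0     // wide-angle lens' nominal focal length
let cropFactor = nominalFocalMM / capturedFocalMM             // ≈ 1.35
let linearCropShare = 1.0 - capturedFocalMM / nominalFocalMM  // ≈ 0.26 of each dimension
```

So roughly a quarter of each dimension would be cropped away, which is in the same ballpark as the ~25% estimate above.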
This is my own conclusion. I didn't find any reference material on this subject, so please don't judge me too harshly if my opinion turns out to be wrong (I admit it may be).
We all know that camera shake is magnified as focal length increases, so the lower the focal length, the less camera shake. This is very important for jitter-free, high-quality world tracking in an AR app. Also, I firmly believe that optical image stabilizers work much better at lower focal lengths. Hence it's no surprise that ARKit engineers chose a lower focal length for the AR experience (capturing a wider image area); after stabilization, we get a modified version of the image, as if it had focal length = 28 mm.
So, in my humble opinion, it makes no sense to calculate a REAL focal length for RealityKit and ARKit, because a "FAKE" focal length has already been implemented by Apple engineers for a robust AR experience.