How to retrieve AVCameraCalibrationData from photo output on iOS?


Question


I'm looking to retrieve the lens distortion coefficients (note: not the focal length nor the principal point) associated with a photo that I capture on an iOS device. From my understanding, the only way to do so on an iOS device is to use AVCameraCalibrationData. The official documentation only provides information on how to retrieve the camera calibration data from AVDepthData, but both the documentation and this StackOverflow answer imply that AVCameraCalibrationData can be used with images, not just with depth data.


Is it possible to retrieve AVCameraCalibrationData information when you're capturing an image? If so, is there documentation around this functionality?

Answer


Background: A lot of these Stack Overflow responses reference intrinsic data when asked about camera calibration, but calibration data typically includes intrinsic data, extrinsic data, lens distortion, and more. It's all listed in the iOS documentation you linked to.


I am going to assume you have the general camera app code in place. In that code, when a picture is taken, the photoOutput delegate callback is invoked; it looks something like this:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) { ... }


The output parameter has a property called isCameraCalibrationDataDeliverySupported that you can check to see whether camera calibration is supported. For example, to print it out, use something like this:

print("isCameraCalibrationDataDeliverySupported: \(output.isCameraCalibrationDataDeliverySupported)")


As noted in the documentation I linked to, it is only supported in specific scenarios:


"This property's value can be true only when the isDualCameraDualPhotoDeliveryEnabled property is true. To enable camera calibration delivery, set the isCameraCalibrationDataDeliveryEnabled property in a photo settings object."


So that's important; pay attention to it to avoid unnecessary stress. Print the actual value while debugging and make sure you have the proper environment enabled.
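As a sketch of that setup (assuming your session's input is the dual camera, which this feature requires, and that `photoOutput` and `delegate` are your existing capture output and delegate), the capture request might be configured like this:

```swift
import AVFoundation

// Sketch: requesting calibration data delivery before capture.
// Assumes `photoOutput` is an AVCapturePhotoOutput already attached to a
// running session whose input is the builtInDualCamera device.
func capturePhotoWithCalibration(using photoOutput: AVCapturePhotoOutput,
                                 delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()

    // Per the quoted documentation, dual-photo delivery must be enabled
    // for calibration delivery to be allowed on older iOS versions.
    if photoOutput.isDualCameraDualPhotoDeliverySupported {
        settings.isDualCameraDualPhotoDeliveryEnabled = true
    }

    // Only request calibration data when the output reports support;
    // setting the flag in an unsupported configuration throws.
    if photoOutput.isCameraCalibrationDataDeliverySupported {
        settings.isCameraCalibrationDataDeliveryEnabled = true
    }

    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```

This is a minimal sketch, not a drop-in implementation: you still need the usual session setup, and on newer iOS versions the dual-camera properties have virtual-device replacements.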


With all that in place, you should get the actual camera calibration data from:

photo.cameraCalibrationData


Just pull values out of that object to get the specific ones you are looking for, such as:

photo.cameraCalibrationData?.extrinsicMatrix
photo.cameraCalibrationData?.intrinsicMatrix
photo.cameraCalibrationData?.lensDistortionCenter
etc.
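Putting the delegate side together, here is a sketch of reading those values, including the lens distortion information the question asks about (the class name is hypothetical; the properties are the real ones on AVCameraCalibrationData):

```swift
import AVFoundation
import simd

// Sketch: reading calibration values in the capture delegate.
class MyCameraDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let calibration = photo.cameraCalibrationData else {
            print("no calibration data delivered")
            return
        }

        // 3x3 intrinsic matrix and 4x3 extrinsic matrix (simd types).
        let intrinsics: matrix_float3x3 = calibration.intrinsicMatrix
        let extrinsics: matrix_float4x3 = calibration.extrinsicMatrix

        // Lens distortion on iOS is expressed as a center point plus a
        // lookup table of radial magnification values, rather than the
        // classic polynomial coefficients.
        let center = calibration.lensDistortionCenter
        if let table = calibration.lensDistortionLookupTable {
            let values = table.withUnsafeBytes {
                Array($0.bindMemory(to: Float32.self))
            }
            print("distortion center \(center), \(values.count) table entries")
        }
        _ = (intrinsics, extrinsics)
    }
}
```

If you need polynomial coefficients for another vision pipeline, you would fit them to this lookup table yourself; Apple only exposes the table form.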


Basically everything that is listed in the documentation you linked to and that I linked to again.


One more thing to note: if you ARE looking for just the intrinsic matrix, it can be obtained much more easily (i.e., under a less stringent environment) than the rest of these values, through the approach outlined in the Stack Overflow answer referenced earlier. If you are using this for computer vision, which is what I am using it for, that is sometimes all that is needed. But for really cool stuff, you'll want it all.
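For completeness, a sketch of that lighter-weight intrinsics-only path, assuming a video data output (here the matrix arrives as a sample-buffer attachment rather than via AVCapturePhoto):

```swift
import AVFoundation
import CoreMedia
import simd

// Sketch: enable per-frame intrinsic matrix delivery on the connection.
// Assumes `videoOutput` is an AVCaptureVideoDataOutput on a running session.
func enableIntrinsics(on videoOutput: AVCaptureVideoDataOutput) {
    if let connection = videoOutput.connection(with: .video),
       connection.isCameraIntrinsicMatrixDeliverySupported {
        connection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}

// In the sample buffer delegate, read the matrix from its attachment.
func intrinsicMatrix(from sampleBuffer: CMSampleBuffer) -> matrix_float3x3? {
    guard let data = CMGetAttachment(
        sampleBuffer,
        key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
        attachmentModeOut: nil) as? Data else { return nil }
    return data.withUnsafeBytes { $0.load(as: matrix_float3x3.self) }
}
```

This path works on single-camera devices too, which is why it is less restrictive than full calibration delivery.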
