iPhone 6 camera calibration for OpenCV


Question

I'm developing an iOS Augmented Reality application using OpenCV. I'm having issues creating the camera projection matrix that would allow the OpenGL overlay to map directly on top of the marker. I believe this is because my iPhone 6 camera is not correctly calibrated for the application. I know there is OpenCV code to calibrate webcams etc. using a chessboard, but I can't find a way to calibrate my embedded iPhone camera.

Is there a way? Or are there known estimate values for the iPhone 6? These would include: the focal length in x and y, the principal point in x and y, and the distortion coefficient matrix.

Any help will be appreciated.
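
For reference, the chessboard calibration mentioned above can be run offline in OpenCV on still images captured with the iPhone camera. The sketch below is only an illustration: the 9x6 inner-corner board, the unit square size, and the chessboardImagePaths list are assumptions, not part of the original question.

// Sketch: offline chessboard calibration with OpenCV on images taken with the iPhone camera.
std::vector<std::vector<cv::Point3f>> objectPoints;
std::vector<std::vector<cv::Point2f>> imagePoints;
cv::Size boardSize(9, 6);          // assumed inner-corner count of the printed board
cv::Size imageSize;

// 3D corner positions of the flat board, using a square size of 1 unit.
std::vector<cv::Point3f> corners3d;
for (int y = 0; y < boardSize.height; ++y)
    for (int x = 0; x < boardSize.width; ++x)
        corners3d.push_back(cv::Point3f(x, y, 0));

for (NSString *path in chessboardImagePaths) {   // hypothetical list of image file paths
    cv::Mat img = cv::imread(path.UTF8String, cv::IMREAD_GRAYSCALE);
    if (img.empty()) continue;
    imageSize = img.size();
    std::vector<cv::Point2f> corners;
    if (cv::findChessboardCorners(img, boardSize, corners)) {
        imagePoints.push_back(corners);
        objectPoints.push_back(corners3d);
    }
}

cv::Mat cameraMatrix, distCoeffs;
std::vector<cv::Mat> rvecs, tvecs;
cv::calibrateCamera(objectPoints, imagePoints, imageSize, cameraMatrix, distCoeffs, rvecs, tvecs);
// cameraMatrix now holds fx, fy, cx, cy; distCoeffs holds the distortion coefficients.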

Edit:

The deduced values are as follows (iPhone 6, camera feed resolution 1280x720):

fx=1229
cx=360
fy=1153
cy=640

This code gives an accurate estimate of the focal length and principal point for devices currently running iOS 9.1.

// Active capture format and its presentation dimensions (e.g. 1280x720).
AVCaptureDeviceFormat *format = deviceInput.device.activeFormat;
CMFormatDescriptionRef fDesc = format.formatDescription;
CGSize dim = CMVideoFormatDescriptionGetPresentationDimensions(fDesc, true, true);

// Assume the principal point sits at the image centre.
float cx = (float)dim.width / 2.0f;
float cy = (float)dim.height / 2.0f;

// The API reports the horizontal FOV in degrees; the vertical FOV is
// approximated by scaling with the aspect ratio.
float HFOV = format.videoFieldOfView;
float VFOV = (HFOV / cx) * cy;

// Pinhole model: focal length in pixels, f = (size / 2) / tan(FOV / 2).
float fx = fabsf((float)dim.width  / (2.0f * tanf(HFOV / 180.0f * (float)M_PI / 2.0f)));
float fy = fabsf((float)dim.height / (2.0f * tanf(VFOV / 180.0f * (float)M_PI / 2.0f)));
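
To feed these intrinsics into OpenCV, for example into cv::solvePnP when estimating the marker pose, they can be packed into a 3x3 camera matrix. This is a minimal sketch under the assumption of zero lens distortion, since no distortion coefficients are measured above.

// Sketch: packing the estimated intrinsics into an OpenCV camera matrix.
// The zero distortion vector is an assumption, not a measured result.
cv::Mat cameraMatrix = (cv::Mat_<double>(3, 3) <<
    fx, 0,  cx,
    0,  fy, cy,
    0,  0,  1);
cv::Mat distCoeffs = cv::Mat::zeros(5, 1, CV_64F);

// e.g. marker pose estimation from 3D-2D correspondences:
// cv::solvePnP(objectPoints, imagePoints, cameraMatrix, distCoeffs, rvec, tvec);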

Note:

I had an initialisation issue with this code. I recommend that, once the values are initialised and correctly set, you save them to a data file and read them back in from that file.
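
A minimal sketch of caching the values, assuming NSUserDefaults rather than a custom data file:

// Sketch: persisting the intrinsics once they have been computed correctly.
// NSUserDefaults is used here for brevity; the note above suggests a data file.
NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
[defaults setFloat:fx forKey:@"camera_fx"];
[defaults setFloat:fy forKey:@"camera_fy"];
[defaults setFloat:cx forKey:@"camera_cx"];
[defaults setFloat:cy forKey:@"camera_cy"];

// Later, read the values back instead of re-deriving them:
float savedFx = [defaults floatForKey:@"camera_fx"];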

Answer

In my non-OpenCV AR application I am using the field of view (FOV) of the iPhone's camera to construct the camera projection matrix. It works well enough for displaying the Sun's path overlaid on top of the camera view. I don't know how much accuracy you need; it could be that knowing only the FOV would not be enough for you.

The iOS API provides a way to get the field of view of the camera. I get it like so:

AVCaptureDevice *camera = ...
AVCaptureDeviceFormat *format = camera.activeFormat;
float fieldOfView = format.videoFieldOfView;   // horizontal FOV in degrees

After getting the FOV I compute the projection matrix:

typedef double mat4f_t[16]; // 4x4 matrix in column major order    

mat4f_t projection;
createProjectionMatrix(projection,
                       GRAD_TO_RAD(fieldOfView),
                       viewSize.width/viewSize.height,
                       5.0f,
                       1000.0f);

where

void createProjectionMatrix(
        mat4f_t mout, 
        float fovy,
        float aspect, 
        float zNear,
        float zFar)
{
    float f = 1.0f / tanf(fovy/2.0f);

    mout[0] = f / aspect;
    mout[1] = 0.0f;
    mout[2] = 0.0f;
    mout[3] = 0.0f;

    mout[4] = 0.0f;
    mout[5] = f;
    mout[6] = 0.0f;
    mout[7] = 0.0f;

    mout[8] = 0.0f;
    mout[9] = 0.0f;
    mout[10] = (zFar+zNear) / (zNear-zFar);
    mout[11] = -1.0f;

    mout[12] = 0.0f;
    mout[13] = 0.0f;
    mout[14] = 2 * zFar * zNear /  (zNear-zFar);
    mout[15] = 0.0f;
}
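
To hand the resulting matrix to an OpenGL ES 2.0 shader, something along these lines could be used; the shaderProgram handle, the u_projection uniform name, and the double-to-float conversion are assumptions on my part, not part of the original answer.

// Sketch: uploading the column-major projection matrix to a GLSL uniform.
// The double-precision entries are converted to float for OpenGL ES.
GLfloat glProjection[16];
for (int i = 0; i < 16; ++i) {
    glProjection[i] = (GLfloat)projection[i];
}
GLint loc = glGetUniformLocation(shaderProgram, "u_projection"); // hypothetical program and uniform name
glUniformMatrix4fv(loc, 1, GL_FALSE, glProjection);              // no transpose: already column-major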
