How to get the lens position on ARKit 1.5

Question

Before ARKit 1.5, we had no way to adjust the focus of the camera, and getting the lens position would always return the same value. With ARKit 1.5, however, we can now use autofocus by setting ARWorldTrackingConfiguration.isAutoFocusEnabled. My question is: is there any way to get the current lens position from ARKit, so that I can apply an out-of-focus effect to my virtual objects? I had a look at some classes where this information might be stored, like ARFrame or ARSession, but they don't seem to have such a field.
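For context, the autofocus setting mentioned above is a real, public API. A minimal sketch of enabling it when starting a session:

```swift
import ARKit

// ARKit 1.5 (iOS 11.3+): autofocus is controlled per-configuration.
let configuration = ARWorldTrackingConfiguration()
configuration.isAutoFocusEnabled = true

let session = ARSession()
session.run(configuration)
```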

I've stumbled upon this thread where the OP says he was able to set the lens position by using some private APIs, but this was before the release of ARKit 1.5, and it's a sure way to get your app rejected by the App Store.

Are there any legal ways to get the lens position from ARKit?

Answer

My guess is: probably not, but there are things you might try.

The intrinsics matrix vended by ARCamera is defined to express focal length in pixel units. But I'm not sure whether that's a measurement you could combine with others (like aperture) to define a depth-blur effect, nor whether it changes during autofocus (that part, at least, you can test).
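If you want to run that test yourself, the intrinsics are easy to read off each frame. A sketch (the layout follows Apple's documented pinhole matrix; simd matrices subscript column-first):

```swift
import ARKit

// Reads the pinhole intrinsics vended by ARCamera.
// Focal length in pixels sits at [0][0] (fx) and [1][1] (fy);
// the principal point at [2][0], [2][1].
func logIntrinsics(for frame: ARFrame) {
    let K = frame.camera.intrinsics
    let fx = K[0][0], fy = K[1][1]
    let cx = K[2][0], cy = K[2][1]
    print("focal length (px): \(fx) x \(fy), principal point: (\(cx), \(cy))")
}
```

Logging this from `session(_:didUpdate:)` while refocusing on near and far objects would show whether the reported focal length actually moves with autofocus.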

The AVCapture APIs underlying ARKit offer a lensPosition indicator, but it's a generic floating-point value: zero is the minimum focus distance, one is the maximum, and with no real-world measurement that value maps to, you wouldn't know how much blur to apply (or which physically based camera settings in SceneKit, or which Unity settings, to use) for each possible lens position.

Even if you could put lensPosition to use, there’s no API for getting the capture device used by an ARSession. You can probably safely assume it’s the back (wide) camera, though.
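Under that assumption, you could at least watch the lens move. This sketch looks up the default back wide-angle camera (an assumption, since ARKit doesn't vouch for which device it uses) and observes its lensPosition via KVO:

```swift
import AVFoundation

// Assumes the ARSession drives the default back wide-angle camera,
// which ARKit does not guarantee. lensPosition is a normalized value:
// 0 = minimum focus distance, 1 = maximum.
final class LensPositionObserver {
    private var observation: NSKeyValueObservation?

    func start() {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        observation = device.observe(\.lensPosition, options: [.new]) { _, change in
            if let position = change.newValue {
                print("lensPosition: \(position)")  // generic 0...1 value, no unit
            }
        }
    }
}
```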
