Raw Depth map SDK for iPhone X
Question
I did some searching and found various examples and documentation on iPhone X Face ID and how it can be used for things like authentication and animated emoji.
I wanted to check whether there is an API/SDK to get the raw depth map from the iPhone X sensor into an app.
From my understanding, the depth calculation is done based on the projected pattern, so it could be used to get a depth profile of any object in front of the sensor. (It might be dependent on the texture of the object.)
Answer
You'll need at least the iOS 11.1 SDK in Xcode 9.1 (both in beta as of this writing). With that, builtInTrueDepthCamera becomes one of the camera types you can use to select a capture device:
let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
Then you can go on to set up an AVCaptureSession with the TrueDepth camera device, and can use that capture session to capture depth information much like you can with the back dual camera on iPhone 7 Plus and 8 Plus:
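A minimal session setup might be sketched like this (error handling is elided; it assumes an iPhone X with camera permission already granted):

```swift
import AVFoundation

// Sketch: wire the TrueDepth camera into a capture session.
let session = AVCaptureSession()
session.beginConfiguration()

guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                           for: .video,
                                           position: .front),
      let input = try? AVCaptureDeviceInput(device: device),
      session.canAddInput(input)
else { fatalError("TrueDepth camera not available") }

session.addInput(input)
session.commitConfiguration()
session.startRunning()
```

With the session running, you attach one of the depth-capable outputs described below.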
- Turn on depth capture for photos with AVCapturePhotoOutput.isDepthDataDeliveryEnabled, then snap a picture with AVCapturePhotoSettings.isDepthDataDeliveryEnabled. You can read the depthData from the AVCapturePhoto object you receive after the capture, or turn on embedsDepthDataInPhoto if you just want to fire and forget (and read the data from the captured image file later).
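Put together, the photo path might look like the sketch below (the class name is made up for illustration; the photo output is assumed to already be attached to a running session with the TrueDepth camera as input):

```swift
import AVFoundation

final class DepthPhotoCapturer: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    // Call after photoOutput has been added to a running session.
    func snap() {
        // Depth delivery must be enabled on the output before it can
        // be requested per photo.
        photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = true
        settings.embedsDepthDataInPhoto = true  // also write depth into the image file

        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let depthData = photo.depthData else { return }
        // AVDepthData wraps a CVPixelBuffer of depth (or disparity) values.
        let depthMap = depthData.depthDataMap
        print(CVPixelBufferGetWidth(depthMap), CVPixelBufferGetHeight(depthMap))
    }
}
```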
- Get a live feed of depth maps with AVCaptureDepthDataOutput. That one is like the video data output; instead of recording directly to a movie file, it gives your delegate a timed sequence of image (or in this case, depth) buffers. If you're also capturing video at the same time, AVCaptureDataOutputSynchronizer might be handy for making sure you get coordinated depth maps and color frames together.
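The live-feed path might be sketched as follows (the class name is invented for illustration; the session is assumed to already have the TrueDepth camera as input):

```swift
import AVFoundation

final class DepthStreamReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    let depthOutput = AVCaptureDepthDataOutput()

    func attach(to session: AVCaptureSession) {
        guard session.canAddOutput(depthOutput) else { return }
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        // Drop late frames rather than queueing them up.
        depthOutput.alwaysDiscardsLateDepthData = true
    }

    // Called once per depth frame, much like a video data output's callback.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let depthMap = depthData.depthDataMap  // CVPixelBuffer of depth values
        _ = depthMap
    }
}
```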
As Apple's Device Compatibility documentation notes, you need to select the builtInTrueDepthCamera device to get any of these depth capture options. If you select the front-facing builtInWideAngleCamera, it becomes like any other selfie camera, capturing only photo and video.
Just to emphasize: from an API point of view, capturing depth with the front-facing TrueDepth camera on iPhone X is a lot like capturing depth with the back-facing dual cameras on iPhone 7 Plus and 8 Plus. So if you want a deep dive on how all this depth capture business works in general, and what you can do with captured depth information, check out the WWDC17 Session 507: Capturing Depth in iPhone Photography talk.