Record video from front facing camera during ARKit ARSession on iPhone X


Question

I'm using an ARSession combined with an ARFaceTrackingConfiguration to track my face. At the same time, I would like to record a video from the front facing camera of my iPhone X. To do so I'm using AVCaptureSession but as soon as I start recording, the ARSession gets interrupted.

Here are the two pieces of code:

// Face tracking
let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = false
let session = ARSession()
session.run(configuration, options: [.removeExistingAnchors, .resetTracking])

// Video recording (separate AVCaptureSession)
let captureSession = AVCaptureSession()
let output = AVCaptureMovieFileOutput()
let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)!
let input = try! AVCaptureDeviceInput(device: camera)
captureSession.addInput(input)
captureSession.addOutput(output)

Does anybody know how to do the two things at the same time? Apps like Snapchat allow users to record and use the True Depth sensor at the same time so I imagine what I'm asking is perfectly feasible. Thanks!

Answer

ARKit runs its own AVCaptureSession, and there can be only one capture session running at a time — if you run a capture session, you preempt ARKit’s, which prevents ARKit from working.

However, ARKit does provide access to the camera pixel buffers it receives from its capture session, so you can record video by feeding those sample buffers to an AVAssetWriter. (It’s basically the same workflow you’d use when recording video from AVCaptureVideoDataOutput... a lower-level way of doing video recording compared to AVCaptureMovieFileOutput.)
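A minimal sketch of that workflow: implement `ARSessionDelegate` and append each `ARFrame.capturedImage` pixel buffer to an `AVAssetWriter` via an `AVAssetWriterInputPixelBufferAdaptor`. The class name `FrameRecorder`, the output dimensions, and the timestamp handling here are illustrative assumptions, not a definitive implementation.

```swift
import ARKit
import AVFoundation

// Illustrative sketch: records ARKit's camera feed by writing each
// frame's pixel buffer to an AVAssetWriter. Set an instance of this
// class as the ARSession's delegate before calling session.run(_:).
final class FrameRecorder: NSObject, ARSessionDelegate {
    private let writer: AVAssetWriter
    private let input: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var startTimestamp: TimeInterval?

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        // Dimensions are an assumption; match them to the actual
        // capturedImage size (see ARCamera.imageResolution).
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ]
        input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: nil)
        writer.add(input)
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if startTimestamp == nil {
            // Start writing on the first frame and anchor time zero there.
            startTimestamp = frame.timestamp
            writer.startWriting()
            writer.startSession(atSourceTime: .zero)
        }
        guard input.isReadyForMoreMediaData, let start = startTimestamp else { return }
        let time = CMTime(seconds: frame.timestamp - start, preferredTimescale: 600)
        adaptor.append(frame.capturedImage, withPresentationTime: time)
    }

    func finish(completion: @escaping () -> Void) {
        input.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

Note that `capturedImage` is delivered in a biplanar YCbCr format, so depending on your output settings you may need a pixel-format conversion before appending.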

You can also feed the ARKit camera pixel buffers (see ARFrame.capturedImage) to other technologies that work with live camera imagery, like the Vision framework. Apple has a sample code project demonstrating such usage.
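For example, a Vision face-detection request can be run directly on a frame's pixel buffer. This is a hedged sketch; the `orientation` value is an assumption and depends on your device orientation.

```swift
import ARKit
import Vision

// Illustrative sketch: run a Vision face-rectangle request on an
// ARKit frame's pixel buffer (e.g. from ARSessionDelegate's
// session(_:didUpdate:) callback).
func detectFaces(in frame: ARFrame) {
    let request = VNDetectFaceRectanglesRequest { request, _ in
        let count = (request.results as? [VNFaceObservation])?.count ?? 0
        print("Detected \(count) face(s)")
    }
    // Vision accepts the CVPixelBuffer directly; the orientation here
    // (.right for portrait) is an assumption to adjust per your app.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                        orientation: .right)
    try? handler.perform([request])
}
```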

