Record video from front facing camera during ARKit ARSession on iPhone X

Problem description

I'm using an ARSession combined with an ARFaceTrackingConfiguration to track my face. At the same time, I would like to record a video from the front facing camera of my iPhone X. To do so I'm using AVCaptureSession but as soon as I start recording, the ARSession gets interrupted.

Here are the two snippets of code:

// Face tracking
let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = false
let session = ARSession()
session.run(configuration, options: [.removeExistingAnchors, .resetTracking])

// Video recording
let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front)!
let input = try! AVCaptureDeviceInput(device: camera)
let output = AVCaptureMovieFileOutput()
let captureSession = AVCaptureSession()
captureSession.addInput(input)
captureSession.addOutput(output)
captureSession.startRunning()

Does anybody know how to do the two things at the same time? Apps like Snapchat allow users to record and use the True Depth sensor at the same time so I imagine what I'm asking is perfectly feasible. Thanks!

Recommended answer

ARKit runs its own AVCaptureSession, and there can be only one capture session running at a time — if you run a capture session, you preempt ARKit’s, which prevents ARKit from working.

However, ARKit does provide access to the camera pixel buffers it receives from its capture session, so you can record video by feeding those sample buffers to an AVAssetWriter. (It’s basically the same workflow you’d use when recording video from AVCaptureVideoDataOutput... a lower-level way of doing video recording compared to AVCaptureMovieFileOutput.)
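As a rough sketch of that workflow (the class name, delegate wiring, and output settings below are illustrative assumptions, not code from the answer), you can act as the session's ARSessionDelegate and append each frame's pixel buffer to an AVAssetWriter:

```swift
import ARKit
import AVFoundation

// Hypothetical recorder: appends each ARFrame's captured pixel buffer to an
// AVAssetWriter. Error handling, audio, and rotation are omitted for brevity.
final class FaceSessionRecorder: NSObject, ARSessionDelegate {
    private let writer: AVAssetWriter
    private let writerInput: AVAssetWriterInput
    private let adaptor: AVAssetWriterInputPixelBufferAdaptor
    private var didStart = false

    init(outputURL: URL) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,   // assumed; match ARKit's actual capture resolution
            AVVideoHeightKey: 720
        ])
        writerInput.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: writerInput, sourcePixelBufferAttributes: nil)
        writer.add(writerInput)
        super.init()
    }

    // ARSessionDelegate: called for every frame ARKit receives
    // from its own internal capture session.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let time = CMTime(seconds: frame.timestamp, preferredTimescale: 600)
        if !didStart {
            writer.startWriting()
            writer.startSession(atSourceTime: time)
            didStart = true
        }
        if writerInput.isReadyForMoreMediaData {
            adaptor.append(frame.capturedImage, withPresentationTime: time)
        }
    }

    func finish(completion: @escaping () -> Void) {
        writerInput.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

Set the recorder as the ARSession's delegate (`session.delegate = recorder`) before calling `run`, and call `finish` when you want to close the movie file.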

You can also feed the ARKit camera pixel buffers (see ARFrame.capturedImage) to other technologies that work with live camera imagery, like the Vision framework. Apple has a sample code project demonstrating such usage.
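For example, a minimal sketch of handing the same pixel buffer to Vision (the function name is made up; the `.right` orientation assumes a portrait-orientation app, since ARKit delivers the camera image rotated):

```swift
import ARKit
import Vision

// Run a Vision face-landmarks request on ARKit's camera image.
func detectFaces(in frame: ARFrame) {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        print("Detected \(faces.count) face(s)")
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                        orientation: .right,
                                        options: [:])
    try? handler.perform([request])
}
```

Calling this from `session(_:didUpdate:)` on every frame can be expensive; throttling requests or dispatching them off the main thread is advisable.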
