Run and Pause an ARSession in a specified period of time


Problem description

I'm developing an ARKit/Vision iOS app with gesture recognition. My app has a simple UI containing a single UIView. There's no ARSCNView/ARSKView at all. I'm putting a sequence of captured ARFrames into a CVPixelBuffer, which I then use for VNRecognizedObjectObservation.

I don't need any tracking data from the session. I just need currentFrame.capturedImage as a CVPixelBuffer. And I need to capture ARFrames at 30 fps; 60 fps is an excessive frame rate.

The preferredFramesPerSecond instance property is useless in my case because it controls the rendering frame rate of an ARSCNView/ARSKView. I have no AR views, and the property doesn't affect the session's capture frame rate.
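
For reference, that property lives on the renderer, not on the session; a hypothetical view (which this app doesn't have) would set it like this:

let sceneView = ARSCNView(frame: .zero)   // hypothetical: this app has no AR views
sceneView.preferredFramesPerSecond = 30   // caps rendering only, not camera capture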

So I decided to use the run() and pause() methods to decrease the session's frame rate.

Question

I'd like to know how to automatically run and pause an ARSession at a specified interval. Each run and pause phase must last 16 ms (0.016 sec). I suppose it might be possible with DispatchQueue, but I don't know how to implement it.

How can I do that?

Here is some pseudocode:

session.run(configuration)

    /*  run lasts 16 ms  */

session.pause()

    /*  pause lasts 16 ms  */

session.run(session.configuration!)

    /*  etc...  */
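
To make the idea concrete, here is a minimal sketch of that cycle, assuming the session from the code in the update below; the sessionIsRunning flag and toggleSession function are hypothetical names, and DispatchQueue.main.asyncAfter makes no real-time guarantee, which is exactly the weakness discussed in the answer:

var sessionIsRunning = true   // hypothetical flag, not part of the original code

func toggleSession() {
    // Schedule the next switch ~16 ms from now. asyncAfter only promises
    // "not earlier than" the deadline, so the real period will drift.
    DispatchQueue.main.asyncAfter(deadline: .now() + .milliseconds(16)) {
        if sessionIsRunning {
            session.pause()                        // pause lasts ~16 ms
        } else {
            session.run(session.configuration!)    // run lasts ~16 ms
        }
        sessionIsRunning.toggle()
        toggleSession()
    }
}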

P.S. I can use neither CocoaPods nor Carthage in my app.

Update: this is how ARSession's currentFrame.capturedImage is retrieved and used:

let session = ARSession()

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    session.delegate = self
    let configuration = ARImageTrackingConfiguration() // 6DOF
    configuration.providesAudioData = false
    configuration.isAutoFocusEnabled = true            
    configuration.isLightEstimationEnabled = false
    configuration.maximumNumberOfTrackedImages = 0
    session.run(configuration)  

    spawnCoreMLUpdate()
}

func spawnCoreMLUpdate() {    // Spawning new async tasks

    dispatchQueue.async {
        self.spawnCoreMLUpdate()
        self.updateCoreML()
    }
}

func updateCoreML() {

    guard let pixelBuffer = session.currentFrame?.capturedImage else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let imageRequestHandler = VNImageRequestHandler(ciImage: ciImage, options: [:])
    do {
        try imageRequestHandler.perform(self.visionRequests)
    } catch {
        print(error)
    }
}

Recommended answer

I don't think the run() and pause() strategy is the way to go, because the DispatchQueue API is not designed for real-time accuracy: there is no guarantee that the pause will last 16 ms every time. On top of that, restarting a session might not be immediate and could add further delay.

Also, the code you shared will capture at most one image, and since session.run(configuration) is asynchronous it will probably capture no frame at all.

Since you're not using an ARSCNView/ARSKView, the only way is to implement the ARSession delegate to be notified of each captured frame.

Of course the delegate will most likely be called every 16 ms, because that's how the camera works, but you can decide which frames to process. Using the frame's timestamp, you can process one frame every 32 ms and drop the others; since a 60 fps camera delivers a frame roughly every 16.7 ms, this keeps every other frame and is equivalent to processing at 30 fps.

Here is some code to get you started; make sure that dispatchQueue is serial (not concurrent), so your buffers are processed sequentially:

var lastProcessedFrame: ARFrame?

func session(_ session: ARSession, didUpdate frame: ARFrame) {
  dispatchQueue.async {
    self.updateCoreML(with: frame)
  }
}

private func shouldProcessFrame(_ frame: ARFrame) -> Bool {
  guard let lastProcessedFrame = lastProcessedFrame else {
    // Always process the first frame
    return true
  }
  return frame.timestamp - lastProcessedFrame.timestamp >= 0.032 // 32ms for 30fps
}

func updateCoreML(with frame: ARFrame) {

  guard shouldProcessFrame(frame) else {
    // Less than 32ms with the previous frame
    return
  }
  lastProcessedFrame = frame
  let pixelBuffer = frame.capturedImage
  let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
  do {
    try imageRequestHandler.perform(self.visionRequests)
  } catch {
    print(error)
  }
}
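
For completeness, the dispatchQueue and visionRequests properties referenced above could be declared as follows; the queue label is just an illustrative name:

// DispatchQueue(label:) creates a serial queue by default,
// so the captured buffers are processed one at a time.
let dispatchQueue = DispatchQueue(label: "com.example.vision-queue")
var visionRequests = [VNRequest]()   // filled elsewhere with e.g. a VNCoreMLRequest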
