iOS Swift - AVCaptureSession - Capture frames respecting frame rate


Question

I'm trying to build an app which will capture frames from the camera and process them with OpenCV before saving the frames to the device, but at a specific frame rate.

What I'm stuck on at the moment is the fact that AVCaptureVideoDataOutputSampleBufferDelegate doesn't appear to respect the AVCaptureDevice.activeVideoMinFrameDuration, or AVCaptureDevice.activeVideoMaxFrameDuration settings.

captureOutput runs far quicker than 2 frames per second, as the above settings would indicate.

Do you happen to know how one could achieve this, with or without the delegate?

ViewController:

override func viewDidLoad() {
    super.viewDidLoad()

}

override func viewDidAppear(animated: Bool) {
    setupCaptureSession()
}

func setupCaptureSession() {

    let session : AVCaptureSession = AVCaptureSession()
    session.sessionPreset = AVCaptureSessionPreset1280x720

    let videoDevices : [AVCaptureDevice] = AVCaptureDevice.devices() as! [AVCaptureDevice]

    for device in videoDevices {
        if device.position == AVCaptureDevicePosition.Back {
            let captureDevice : AVCaptureDevice = device

            do {
                try captureDevice.lockForConfiguration()
                captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
                captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
                captureDevice.unlockForConfiguration()

                let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

                if session.canAddInput(input) {
                    session.addInput(input)
                }

                let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

                let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
                output.setSampleBufferDelegate(self, queue: dispatch_queue)

                session.addOutput(output)

                session.startRunning()

                let previewLayer = AVCaptureVideoPreviewLayer(session: session)
                previewLayer.connection.videoOrientation = .LandscapeRight

                let previewBounds : CGRect = CGRectMake(0,0,self.view.frame.width/2,self.view.frame.height+20)
                previewLayer.backgroundColor = UIColor.blackColor().CGColor
                previewLayer.frame = previewBounds
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                self.imageView.layer.addSublayer(previewLayer)

                self.previewMat.frame = CGRectMake(previewBounds.width, 0, previewBounds.width, previewBounds.height)

            } catch _ {

            }
            break
        }
    }

}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    self.wrapper.processBuffer(self.getUiImageFromBuffer(sampleBuffer), self.previewMat)
}

Answer

So I've figured out the problem.

In the comments section of AVCaptureDevice.h, above the activeVideoMinFrameDuration property, it states:

On iOS, the receiver's activeVideoMinFrameDuration resets to its default value under the following conditions:

  • The receiver's activeFormat changes
  • The receiver's AVCaptureDeviceInput's session's sessionPreset changes
  • The receiver's AVCaptureDeviceInput is added to a session

The last bullet point was causing my problem, so doing the following (setting the frame durations only after adding the input to the session) solved the problem for me:

        do {

            let input : AVCaptureDeviceInput = try AVCaptureDeviceInput(device: captureDevice)

            if session.canAddInput(input) {
                session.addInput(input)
            }

            try captureDevice.lockForConfiguration()
            captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
            captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
            captureDevice.unlockForConfiguration()

            let output : AVCaptureVideoDataOutput = AVCaptureVideoDataOutput()

            let dispatch_queue : dispatch_queue_t = dispatch_queue_create("streamoutput", nil)
            output.setSampleBufferDelegate(self, queue: dispatch_queue)

            session.addOutput(output)
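Independently of the fix above, if the capture callback still fires faster than wanted (e.g. on devices or formats that ignore the requested duration), a delegate-side throttle can simply drop frames. Below is a minimal sketch in plain Swift; `FrameRateGate` and its use inside `captureOutput` are hypothetical helpers, not part of the accepted answer:

```swift
import Foundation

// Hypothetical helper: decides whether to process a frame, given the
// buffer's presentation timestamp in seconds. This keeps the camera at
// its native rate but only lets one frame through per interval.
struct FrameRateGate {
    let minimumInterval: TimeInterval   // seconds between processed frames
    private var lastAccepted: TimeInterval? = nil

    init(minimumInterval: TimeInterval) {
        self.minimumInterval = minimumInterval
    }

    // Returns true when enough time has elapsed since the last accepted frame.
    mutating func shouldProcess(at timestamp: TimeInterval) -> Bool {
        if let last = lastAccepted, timestamp - last < minimumInterval {
            return false  // too soon: drop this frame
        }
        lastAccepted = timestamp
        return true
    }
}

// Inside captureOutput you would feed it the sample buffer's timestamp, e.g.
//   let t = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
//   if gate.shouldProcess(at: t) { /* run the OpenCV processing */ }
var gate = FrameRateGate(minimumInterval: 0.5)
let decisions = [0.0, 0.1, 0.4, 0.6, 1.0, 1.2].map { gate.shouldProcess(at: $0) }
print(decisions)  // [true, false, false, true, false, true]
```

Note that frames are still delivered and decoded at the full rate with this approach, so it costs more battery than configuring the device itself; it is a fallback, not a replacement for the activeVideoMinFrameDuration fix.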
