How to record and save at 240 frames per second?

Problem Description

I need to record and save video from an iPhone Xs at the phone's max frame rate (240 fps). The saved file always ends up at 30 fps. I've been through a dozen guides/docs/Stack Overflow posts but have yet to hit on the right solution. I've tested by opening the recorded file in VLC as well as by extracting and counting frames.
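
For reference, a minimal sketch of how the frame rate of a saved file can be checked programmatically with AVAsset (the file name below is just a placeholder for wherever the recording ends up):

import AVFoundation

// Minimal sketch: read back the nominal frame rate of a recorded movie.
// "recording.mp4" is a placeholder, not the app's real output file.
let movieURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("recording.mp4")
let asset = AVAsset(url: movieURL)
if let videoTrack = asset.tracks(withMediaType: .video).first {
    print("Nominal frame rate: \(videoTrack.nominalFrameRate) fps")
}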

What am I doing wrong?

Environment: Xcode 10.1, build target iOS 12.1, tested on an iPhone Xs running iOS 12.1.2

Here I access the device and configure it for the best frame rate it supports:

override func viewDidLoad() {
    super.viewDidLoad()
    let deviceDiscoverSession = AVCaptureDevice.default(.builtInWideAngleCamera, for: AVMediaType.video, position: .back)
    guard let backCameraDevice = deviceDiscoverSession else {
        print("Failed to open back, wide-angle camera")
        return
    }
    self.currentDevice = backCameraDevice
    do {
        let input = try AVCaptureDeviceInput(device: backCameraDevice)
        // configureDevice() // putting this here makes no difference
        self.session.addInput(input)
        configureDevice()
    } catch {
        print(error)
        return
    }
}

func configureDevice() {
    var bestFormat: AVCaptureDevice.Format? = nil
    var bestFrameRateRange: AVFrameRateRange? = nil
    var bestPixelArea: Int32 = 0
    for format in currentDevice!.formats {
        let dims: CMVideoDimensions = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        let pixelArea: Int32 = dims.width * dims.height
        let ranges = format.videoSupportedFrameRateRanges
        for range in ranges {
            if bestFrameRateRange == nil ||
                range.maxFrameRate > bestFrameRateRange!.maxFrameRate ||
                (range.maxFrameRate == bestFrameRateRange!.maxFrameRate && pixelArea > bestPixelArea) {
                bestFormat = format as AVCaptureDevice.Format
                bestFrameRateRange = range
                bestPixelArea = pixelArea
            }
        }
    }
    do {
        try currentDevice!.lockForConfiguration()
        if let best_format = bestFormat {
            currentDevice!.activeFormat = best_format
            currentDevice!.activeVideoMinFrameDuration = bestFrameRateRange!.minFrameDuration
            currentDevice!.activeVideoMaxFrameDuration = bestFrameRateRange!.maxFrameDuration
        }
    } catch {
        print(error)
    }

    let movieFileOutput = AVCaptureMovieFileOutput()

    if self.session.canAddOutput(movieFileOutput) {
        self.session.beginConfiguration()
        self.session.addOutput(movieFileOutput)
        self.session.sessionPreset = .high
        if let connection = movieFileOutput.connection(with: .video) {
            if movieFileOutput.availableVideoCodecTypes.contains(.h264) {
                movieFileOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.h264], for: connection)
            }
        }
        self.session.commitConfiguration()
        self.movieFileOutput = movieFileOutput
    }
    currentDevice!.unlockForConfiguration()
}

When the user stops recording, I call a function that in part contains the following code to save the file to the temp directory (later moved to the app's documents directory)

sessionQueue.async {
    if !self.isRecording {
        self.isRecording = true
        let movieFileOutputConnection = self.movieFileOutput!.connection(with: .video)
        movieFileOutputConnection?.videoOrientation = .landscapeRight
        let availableVideoCodecTypes = self.movieFileOutput!.availableVideoCodecTypes
        if availableVideoCodecTypes.contains(.h264) {
           self.movieFileOutput!.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.h264], for: movieFileOutputConnection!)
        }
        let outputFileName = NSUUID().uuidString
        let outputFilePath = (NSTemporaryDirectory() as NSString).appendingPathComponent((outputFileName as NSString).appendingPathExtension("mp4")!)
        self.movieFileOutput?.startRecording(to: URL(fileURLWithPath: outputFilePath), recordingDelegate: self)
    } else {
        self.movieFileOutput?.stopRecording()
        self.isRecording = false
    }
}

The typical answer seems to be to configure the device after adding it to the session. Calling configure before or after adding to the session doesn't seem to make a difference.

I've tried configuring movieFileOutput right before the self.movieFileOutput?.startRecording() call as well as where I show it above. Both give the same results.

Recommended Answer

I solved this by reading https://stackoverflow.com/a/41109637/292947 more carefully.

configureDevice()中,我实际上是在设置self.session.sessionPreset = .high,而实际上我需要设置self.session.sessionPreset = .inputPriority,它与上述答案中建议的AVCaptureSessionPresetInputPriority值等效于Swift 4.

In configureDevice() I was setting self.session.sessionPreset = .high, when in fact I needed to set self.session.sessionPreset = .inputPriority, which is the Swift 4 equivalent of the AVCaptureSessionPresetInputPriority value suggested in that answer.
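
For completeness, a minimal sketch of the relevant change inside configureDevice(), with everything else left exactly as posted above:

if self.session.canAddOutput(movieFileOutput) {
    self.session.beginConfiguration()
    self.session.addOutput(movieFileOutput)
    // .inputPriority makes the session honor the device's activeFormat and
    // frame durations instead of overriding them with a preset (was .high).
    self.session.sessionPreset = .inputPriority
    if let connection = movieFileOutput.connection(with: .video),
        movieFileOutput.availableVideoCodecTypes.contains(.h264) {
        movieFileOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.h264], for: connection)
    }
    self.session.commitConfiguration()
    self.movieFileOutput = movieFileOutput
}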

