macOS/swift Capture Audio with AVCaptureSession


Question

I am currently trying to implement a simple audio recording tool on my Mac. Since I need the raw audio buffers in-memory, I cannot use AVAudioRecorder, which would just write the recording to a file.

My approach is to create an AVCaptureSession with an input (microphone) and an output (AVCaptureAudioDataOutput), and then start the session. Everything works fine, however the delegate callback of the output is never called.

I made sure to add mic/camera permissions (just in case) in the project settings.
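For reference, a minimal sketch of checking and requesting microphone access on macOS 10.14+ (not part of the original post; `ensureMicrophoneAccess` is a hypothetical helper name). The prompt only appears if the app's Info.plist contains an `NSMicrophoneUsageDescription` entry; without granted access, an AVCaptureSession delivers no sample buffers:

```swift
import AVFoundation

// Sketch: verify microphone authorization before starting the capture session.
func ensureMicrophoneAccess(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        completion(true)
    case .notDetermined:
        // Shows the system permission prompt once;
        // requires NSMicrophoneUsageDescription in Info.plist.
        AVCaptureDevice.requestAccess(for: .audio) { granted in
            completion(granted)
        }
    default:
        // .denied or .restricted: the user must change this in System Settings.
        completion(false)
    }
}
```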

Maybe someone can help me!

Here is my code:

import Foundation
import AVFoundation

class AudioCaptureSession: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {

    let settings = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey : 1,
        AVSampleRateKey : 44100]
    let captureSession = AVCaptureSession()

    override init() {
        super.init()

        let queue = DispatchQueue(label: "AudioSessionQueue", attributes: [])
        let captureDevice = AVCaptureDevice.default(for: AVMediaType.audio)
        var audioInput : AVCaptureDeviceInput? = nil
        var audioOutput : AVCaptureAudioDataOutput? = nil

        do {
            try captureDevice?.lockForConfiguration()
            audioInput = try AVCaptureDeviceInput(device: captureDevice!)
            captureDevice?.unlockForConfiguration()
            audioOutput = AVCaptureAudioDataOutput()
            audioOutput?.setSampleBufferDelegate(self, queue: queue)
            audioOutput?.audioSettings = settings
        } catch {
            print("Capture devices could not be set")
            print(error.localizedDescription)
        }

        if audioInput != nil && audioOutput != nil {
            captureSession.beginConfiguration()
            if (captureSession.canAddInput(audioInput!)) {
                captureSession.addInput(audioInput!)
            } else {
                print("cannot add input")
            }
            if (captureSession.canAddOutput(audioOutput!)) {
                captureSession.addOutput(audioOutput!)
            } else {
                print("cannot add output")
            }
            captureSession.commitConfiguration()

            print("Starting capture session")
            captureSession.startRunning()
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {

        print("Audio data received")
    }
}

Answer

It's called for me. You don't show how you use it, but maybe your AudioCaptureSession is going out of scope and being deallocated.
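A sketch of what that fix could look like (`Recorder` and `startRecordingBroken` are hypothetical names, not from the original code): hold the session in a stored property whose owner lives as long as the recording, rather than in a local variable.

```swift
// Strong stored property: AudioCaptureSession (and its AVCaptureSession)
// stays alive for the lifetime of the owning object.
final class Recorder {
    private let audioCapture = AudioCaptureSession()
}

// By contrast, a local variable is released as soon as the function
// returns, which would stop the session and its delegate callbacks.
func startRecordingBroken() {
    let capture = AudioCaptureSession()
    _ = capture
}
```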
