iOS Swift read PCM Buffer


Question

I have an Android project that reads a short[] array of PCM data from the microphone buffer for live analysis. I need to port this functionality to iOS in Swift. In Android it is very simple and looks like this:

import android.media.AudioFormat;
import android.media.AudioRecord;
...
AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.DEFAULT, someSampleRate, AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT, AudioRecord.getMinBufferSize(...));
recorder.startRecording();

Later I read the buffer:

recorder.read(data, offset, length); //data is short[]

(and that's what I want)

Documentation: https://developer.android.com/reference/android/media/AudioRecord.html

I'm very new to Swift and iOS. I've read a lot of documentation about AudioToolbox, Core Audio, and so on. All I found were C++/Objective-C solutions with a bridging Swift header, which is much too advanced and too outdated for me.

For now I can record PCM data to a CAF file with AVFoundation:

settings = [
    AVLinearPCMBitDepthKey: 16 as NSNumber,
    AVFormatIDKey: Int(kAudioFormatLinearPCM),
    AVLinearPCMIsBigEndianKey: 0 as NSNumber,
    AVLinearPCMIsFloatKey: 0 as NSNumber,
    AVSampleRateKey: 12000.0 as NSNumber,
    AVNumberOfChannelsKey: 1 as NSNumber,
]
...
recorder = try AVAudioRecorder(url: someURL, settings: settings)
recorder.delegate = self
recorder.record()

But that's not what I'm looking for (or is it?). Is there an elegant way to achieve the Android read functionality described above? I need to get an array of samples from the microphone buffer. Or do I need to do the reading on the recorded CAF file?
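
For reference (not part of the original question): if you do end up reading the samples back out of the recorded CAF file, AVAudioFile can load the whole file into an AVAudioPCMBuffer. A minimal sketch, assuming the file at someURL was recorded as above; the helper name is illustrative:

import AVFoundation

// Hypothetical helper, not from the question: reads a whole recorded file into a float array.
func readSamples(from fileURL: URL) throws -> [Float] {
    let file = try AVAudioFile(forReading: fileURL)
    let frameCount = AVAudioFrameCount(file.length)

    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: frameCount) else {
        return []
    }
    try file.read(into: buffer)          // fills buffer.frameLength frames

    guard let channelData = buffer.floatChannelData else { return [] }
    // Copy channel 0 into a Swift array (non-interleaved float PCM).
    return Array(UnsafeBufferPointer(start: channelData[0],
                                     count: Int(buffer.frameLength)))
}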

Thanks a lot! Please help me with simple explanations or code examples. iOS terminology isn't mine yet ;-)

Answer

If you don't mind floating-point samples and 48 kHz, you can quickly get audio data from the microphone like so:

import AVFoundation

let engine = AVAudioEngine()    // instance variable

func setup() {
    let input = engine.inputNode
    let bus = 0

    // Install a tap on the input node; the closure is called with each captured PCM buffer.
    input.installTap(onBus: bus, bufferSize: 512, format: input.inputFormat(forBus: bus)) { (buffer, time) in
        if let samples = buffer.floatChannelData?[0] {
            // audio callback, samples in samples[0]...samples[Int(buffer.frameLength) - 1]
        }
    }

    try! engine.start()
}
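
The Android code in the question delivers 16-bit integer samples (short[]), while the tap above delivers floats. One way to get the same thing on iOS is to run each tapped buffer through an AVAudioConverter into a 16-bit integer format. This is a minimal sketch, not from the original answer, and it assumes the output stays at the hardware sample rate (resampling to e.g. 12 kHz would need extra buffering in the input block):

import AVFoundation

let engine = AVAudioEngine()    // instance variable, as above

// Sketch only: converts each tapped float buffer to 16-bit integer samples,
// roughly matching the Android short[] workflow. Names are illustrative.
func setupInt16Tap() {
    let input = engine.inputNode
    let bus = 0
    let inputFormat = input.inputFormat(forBus: bus)

    // Target format: 16-bit signed integer PCM, mono, same rate as the hardware input.
    guard let int16Format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                          sampleRate: inputFormat.sampleRate,
                                          channels: 1,
                                          interleaved: true),
          let converter = AVAudioConverter(from: inputFormat, to: int16Format) else { return }

    input.installTap(onBus: bus, bufferSize: 512, format: inputFormat) { buffer, _ in
        guard let converted = AVAudioPCMBuffer(pcmFormat: int16Format,
                                               frameCapacity: buffer.frameLength) else { return }
        var consumed = false
        var error: NSError?
        let status = converter.convert(to: converted, error: &error) { _, outStatus in
            // Hand the tapped buffer to the converter exactly once per callback.
            if consumed {
                outStatus.pointee = .noDataNow
                return nil
            }
            consumed = true
            outStatus.pointee = .haveData
            return buffer
        }
        guard status != .error, error == nil,
              let int16Samples = converted.int16ChannelData?[0] else { return }
        // int16Samples[0]...int16Samples[Int(converted.frameLength) - 1] are the "short" samples.
    }

    try? engine.start()
}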
