Generating a tone in iOS with 16 bit PCM, AudioEngine.connect() throws AUSetFormat: error -10868

Question

I have the following code for generating an audio tone of given frequency and duration. It's loosely based on this answer for doing the same thing on Android (thanks: @Steve Pomeroy):

https://stackoverflow.com/a/3731075/973364

import Foundation
import CoreAudio
import AVFoundation
import Darwin

class AudioUtil {

    class func play(frequency: Int, durationMs: Int) -> Void {
        let sampleRateHz: Double = 8000.0
        let numberOfSamples = Int((Double(durationMs) / 1000 * sampleRateHz))
        let factor: Double = 2 * M_PI / (sampleRateHz/Double(frequency))

        // Generate an array of Doubles.
        var samples = [Double](count: numberOfSamples, repeatedValue: 0.0)

        for i in 1..<numberOfSamples {
            let sample = sin(factor * Double(i))
            samples[i] = sample
        }

        // Convert to a 16 bit PCM sound array.
        var index = 0
        var sound = [Byte](count: 2 * numberOfSamples, repeatedValue: 0)

        for doubleValue in samples {
            // Scale to maximum amplitude. Int16.max is 32,767.
            var value = Int16(doubleValue * Double(Int16.max))

            // In a 16 bit wav PCM, first byte is the low order byte.
            var firstByte = Int16(value & 0x00ff)
            var secondByteHighOrderBits = Int32(value) & 0xff00
            var secondByte = Int16(secondByteHighOrderBits >> 8) // Right shift.

            // println("\(doubleValue) -> \(value) -> \(firstByte), \(secondByte)")

            sound[index++] = Byte(firstByte)
            sound[index++] = Byte(secondByte)
        }

        let format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatInt16, sampleRate: sampleRateHz, channels:AVAudioChannelCount(1), interleaved: false)
        let buffer = AudioBuffer(mNumberChannels: 1, mDataByteSize: UInt32(sound.count), mData: &sound)
        let pcmBuffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: AVAudioFrameCount(sound.count))
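        // Note: frameCapacity above is given the byte count (2 * numberOfSamples)
        // rather than the frame count, and `sound` is never copied into
        // pcmBuffer, so this buffer would render silence even if the
        // connect() below succeeded.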
        let audioEngine = AVAudioEngine()
        let audioPlayer = AVAudioPlayerNode()

        audioEngine.attachNode(audioPlayer)
        // Runtime error occurs here:
        audioEngine.connect(audioPlayer, to: audioEngine.mainMixerNode, format: format)
        audioEngine.startAndReturnError(nil)

        audioPlayer.play()
        audioPlayer.scheduleBuffer(pcmBuffer, atTime: nil, options: nil, completionHandler: nil)
    }
}

The error I get at runtime when calling connect() on the AVAudioEngine is this:

ERROR:     [0x3bfcb9dc] AVAudioNode.mm:521: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'

Is what I'm generating not really AVAudioCommonFormat.PCMFormatInt16?
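
The Int16 samples themselves are probably fine; it's the format handed to connect() that is being rejected. Error -10868 is kAudioUnitErr_FormatNotSupported: the player-to-mixer connection expects the engine's canonical deinterleaved Float32 format, so AUSetFormat fails for 16 bit integer PCM. A minimal sketch (same Swift 1.x era APIs as the code above) of asking the mixer what it wants instead of dictating a format:

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attachNode(player)

// The mixer reports the engine's canonical format: deinterleaved Float32.
let mixerFormat = engine.mainMixerNode.outputFormatForBus(0)
println(mixerFormat.commonFormat == AVAudioCommonFormat.PCMFormatFloat32) // typically true
println(mixerFormat.sampleRate) // typically 44100.0

// Connecting with the mixer's own format avoids -10868.
engine.connect(player, to: engine.mainMixerNode, format: mixerFormat)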

Here's another, simpler attempt using only one buffer as PCMFormatFloat32. There's no error, but no sound either.

import AVFoundation

class AudioManager:NSObject {

    let audioPlayer = AVAudioPlayerNode()

    lazy var audioEngine: AVAudioEngine = {
        let engine = AVAudioEngine()

        // Must happen only once.
        engine.attachNode(self.audioPlayer)

        return engine
    }()

    func play(frequency: Int, durationMs: Int, completionBlock:dispatch_block_t!) {
        var error: NSError?

        var mixer = audioEngine.mainMixerNode
        var sampleRateHz: Float = Float(mixer.outputFormatForBus(0).sampleRate)
        var numberOfSamples = AVAudioFrameCount((Float(durationMs) / 1000 * sampleRateHz))

        var format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: Double(sampleRateHz), channels: AVAudioChannelCount(1), interleaved: false)

        var buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: numberOfSamples)
        buffer.frameLength = numberOfSamples

        // Generate sine wave
        for var i = 0; i < Int(buffer.frameLength); i++ {
            var val = sinf(Float(frequency) * Float(i) * 2 * Float(M_PI) / sampleRateHz)

            // log.debug("val: \(val)")

            buffer.floatChannelData.memory[i] = val * 0.5
        }

        // Audio engine
        audioEngine.connect(audioPlayer, to: mixer, format: format)

        log.debug("Sample rate: \(sampleRateHz), samples: \(numberOfSamples), format: \(format)")

        if !audioEngine.startAndReturnError(&error) {
            log.debug("Error: \(error)")
        }

        // Play player and buffer
        audioPlayer.play()
        audioPlayer.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)
    }
}

Thanks: Thomas Royal (http://www.tmroyal.com/playing-sounds-in-swift-audioengine.html)

Answer

The problem was that when falling out of the play() function, the player was getting cleaned up and never completed (or barely started) playing. Here's one fairly clumsy solution to that: sleep for as long as the sample before returning from play().

I'll accept a better answer that avoids this workaround (by keeping the player from being cleaned up), if anyone wants to post one.

import AVFoundation

class AudioManager: NSObject, AVAudioPlayerDelegate {

    let audioPlayerNode = AVAudioPlayerNode()

    var waveAudioPlayer: AVAudioPlayer?

    var playing: Bool! = false

    lazy var audioEngine: AVAudioEngine = {
        let engine = AVAudioEngine()

        // Must happen only once.
        engine.attachNode(self.audioPlayerNode)

        return engine
    }()

    func playWaveFromBundle(filename: String, durationInSeconds: NSTimeInterval) -> Void {
        var error: NSError?
        var sound = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(filename, ofType: "wav")!)

        self.waveAudioPlayer = AVAudioPlayer(contentsOfURL: sound, error: &error)
        self.waveAudioPlayer!.delegate = self

        // AVAudioPlayer's initializer reports failure through `error`,
        // so check it only after the player has been created.
        if error != nil {
            log.error("Error: \(error)")
            return
        }

        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        log.verbose("Playing \(sound)")

        self.waveAudioPlayer!.prepareToPlay()

        playing = true

        if !self.waveAudioPlayer!.play() {
            log.error("Failed to play")
        }

        // If we don't block here, the player stops as soon as this function returns. While we'd prefer to wait for audioPlayerDidFinishPlaying() to be called here, it's never called if we block here. Instead, pass in the duration of the wave file and simply sleep for that long.
        /*
        while (playing!) {
            NSThread.sleepForTimeInterval(0.1) // seconds
        }
        */

        NSThread.sleepForTimeInterval(durationInSeconds)

        log.verbose("Done")
    }

    func play(frequency: Int, durationInMillis: Int, completionBlock:dispatch_block_t!) -> Void {
        var session = AVAudioSession.sharedInstance()
        var error: NSError?

        if !session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error) {
            log.error("Error: \(error)")
            return
        }

        var mixer = audioEngine.mainMixerNode
        var sampleRateHz: Float = Float(mixer.outputFormatForBus(0).sampleRate)
        var numberOfSamples = AVAudioFrameCount((Float(durationInMillis) / 1000 * sampleRateHz))

        var format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: Double(sampleRateHz), channels: AVAudioChannelCount(1), interleaved: false)

        var buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: numberOfSamples)
        buffer.frameLength = numberOfSamples

        // Generate sine wave
        for var i = 0; i < Int(buffer.frameLength); i++ {
            var val = sinf(Float(frequency) * Float(i) * 2 * Float(M_PI) / sampleRateHz)

            // log.debug("val: \(val)")

            buffer.floatChannelData.memory[i] = val * 0.5
        }

        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        // Audio engine
        audioEngine.connect(audioPlayerNode, to: mixer, format: format)

        log.debug("Sample rate: \(sampleRateHz), samples: \(numberOfSamples), format: \(format)")

        if !audioEngine.startAndReturnError(&error) {
            log.error("Error: \(error)")
            return
        }

        // TODO: Check we're not in the background. Attempting to play audio while in the background throws:
        //   *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error 561015905'

        // Play player and schedule buffer
        audioPlayerNode.play()
        audioPlayerNode.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)

        // If we don't block here, the player stops as soon as this function returns.
        NSThread.sleepForTimeInterval(Double(durationInMillis) / 1000.0) // milliseconds to seconds
    }

    // MARK: AVAudioPlayerDelegate

    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
        log.verbose("Success: \(flag)")

        playing = false
    }

    func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer!, error: NSError!) {
        log.verbose("Error: \(error)")

        playing = false
    }

    // MARK: NSObject overrides

    deinit {
        log.verbose("deinit")
    }

}
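
One way to avoid guessing a sleep duration, sketched with the same Swift 1.x era GCD APIs: replace the tail of play() above (the play/scheduleBuffer calls and the sleep) with a semaphore that the completion handler signals. This still blocks the calling thread, but it unblocks exactly when the buffer is done. Caveat: the handler fires when the player has consumed the buffer, which can be slightly before the final samples reach the hardware.

let semaphore = dispatch_semaphore_create(0)

// Play and schedule the buffer; signal when the player has consumed it.
audioPlayerNode.play()
audioPlayerNode.scheduleBuffer(buffer, atTime: nil, options: nil) { () -> Void in
    dispatch_semaphore_signal(semaphore)
    return
}

// Block the calling thread until playback of the buffer completes.
dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER)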

For context, this AudioManager is a lazy loaded property on my AppDelegate:

lazy var audioManager: AudioManager = {
    return AudioManager()
}()
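
A hypothetical call site (the button-handler context, the 440 Hz tone, and the one second duration are illustrative assumptions, not from the original):

import UIKit

// E.g. from a button handler; note that play() blocks its thread for the
// duration of the tone, and the completion handler runs off the main thread.
let appDelegate = UIApplication.sharedApplication().delegate as! AppDelegate
appDelegate.audioManager.play(440, durationInMillis: 1000) {
    println("Tone finished")
}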
