Trying to stream audio from microphone to another phone via multipeer connectivity


Question

I am trying to stream audio from the microphone to another iPhone via Apple's Multipeer Connectivity framework. To do the audio capturing and playback I am using AVAudioEngine (much thanks to Rhythmic Fistman's answer here).

I receive data from the microphone by installing a tap on the input node. This gives me an AVAudioPCMBuffer, which I convert to an array of UInt8 and then stream to the other phone.

But when I convert the array back to an AVAudioPCMBuffer, I get an EXC_BAD_ACCESS exception, with the debugger pointing to the method where I convert the byte array back to an AVAudioPCMBuffer.

Here is the code where I capture, convert, and stream the input:

input.installTap(onBus: 0, bufferSize: 2048, format: input.inputFormat(forBus: 0), block: {
    (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in

    let audioBuffer = self.typetobinary(buffer)
    stream.write(audioBuffer, maxLength: audioBuffer.count)
})

Both of my functions for converting the data (taken from Martin.R's answer here):

func binarytotype<T>(_ value: [UInt8], _: T.Type) -> T {
    return value.withUnsafeBufferPointer {
        UnsafeRawPointer($0.baseAddress!).load(as: T.self)
    }
}

func typetobinary<T>(_ value: T) -> [UInt8] {
    var data = [UInt8](repeating: 0, count: MemoryLayout<T>.size)
    data.withUnsafeMutableBufferPointer {
        UnsafeMutableRawPointer($0.baseAddress!).storeBytes(of: value, as: T.self)
    }
    return data
}
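These generic helpers are fine for trivial value types, but AVAudioPCMBuffer is a class, so MemoryLayout<T>.size reports the size of the reference, not of the audio data. The sketch below illustrates the problem with a hypothetical DummyBuffer class as a stand-in (AVFoundation is not needed to make the point):

```swift
import Foundation

// Hypothetical stand-in for a class like AVAudioPCMBuffer.
final class DummyBuffer {
    var samples = [Float](repeating: 0, count: 1024)
}

// For a value type, MemoryLayout<T>.size is the size of the data itself...
let floatSize = MemoryLayout<Float>.size            // 4
// ...but for a class it is the size of the *reference* (8 bytes on 64-bit),
// not of the object's storage. So typetobinary(buffer) copies only a pointer,
// which is meaningless once it reaches the other phone.
let bufferRefSize = MemoryLayout<DummyBuffer>.size  // 8 on 64-bit platforms

print(floatSize, bufferRefSize)
```

This is why the round trip works on the sending phone (the pointer still points at a live object there) but crashes on the receiver.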

On the receiving end:

func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {
    if streamName == "voice" {

        stream.schedule(in: RunLoop.current, forMode: .defaultRunLoopMode)
        stream.open()

        var bytes = [UInt8](repeating: 0, count: 8)
        stream.read(&bytes, maxLength: bytes.count)

        let audioBuffer = self.binarytotype(bytes, AVAudioPCMBuffer.self) // Here is where the app crashes

        do {
            try engine.start()

            audioPlayer.scheduleBuffer(audioBuffer, completionHandler: nil)
            audioPlayer.play()
        } catch let error {
            print(error.localizedDescription)

        }
    }
}
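Note also that this handler reads at most 8 bytes per stream event, while an audio payload will usually be much larger, so the stream should be drained in a loop. A rough sketch of such a loop, run here against an in-memory stream standing in for the MCSession stream (the 1024-byte chunk size is an arbitrary choice):

```swift
import Foundation

// In-memory stand-in for the incoming MCSession stream.
let payload = Data((0..<5000).map { UInt8($0 % 256) })
let stream = InputStream(data: payload)
stream.open()

// Drain the stream in fixed-size chunks until it is exhausted.
var received = [UInt8]()
var chunk = [UInt8](repeating: 0, count: 1024)
while stream.hasBytesAvailable {
    let n = stream.read(&chunk, maxLength: chunk.count)
    if n <= 0 { break }  // 0 = end of stream, negative = error
    received.append(contentsOf: chunk[0..<n])
}
stream.close()

print(received.count)
```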

The thing is that I can convert the byte array back and forth and play sound from it before streaming (on the same phone), but I cannot create the AVAudioPCMBuffer on the receiving end. Does anyone know why the conversion doesn't work on the receiving end? Is this the right approach?

Any help, thoughts, or input on this would be much appreciated.

Answer

Your AVAudioPCMBuffer serialisation/deserialisation is wrong.

Swift 3's casting has changed a lot and seems to require more copying than Swift 2 did.

Here's how you can convert between [UInt8] and AVAudioPCMBuffers:

N.B.: this code assumes mono float data at 44.1kHz. You might want to change that.
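One way to avoid hard-coding the format is to prepend a small header to the stream, so the format travels with the audio. This is only a sketch of how you might extend the wire protocol; the struct name and layout are illustrative, not part of any framework:

```swift
import Foundation

// Hypothetical 8-byte header: 4 bytes sample rate + 4 bytes channel count,
// both packed little-endian so the layout is explicit and portable.
struct AudioHeader {
    var sampleRate: UInt32
    var channels: UInt32

    func encoded() -> [UInt8] {
        func pack(_ v: UInt32) -> [UInt8] {
            (0..<4).map { UInt8((v >> (8 * $0)) & 0xFF) }
        }
        return pack(sampleRate) + pack(channels)
    }

    static func decoded(from bytes: [UInt8]) -> AudioHeader? {
        guard bytes.count >= 8 else { return nil }
        func unpack(_ slice: ArraySlice<UInt8>) -> UInt32 {
            slice.enumerated().reduce(UInt32(0)) { acc, pair in
                acc | UInt32(pair.element) << (8 * UInt32(pair.offset))
            }
        }
        return AudioHeader(sampleRate: unpack(bytes[0..<4]),
                           channels: unpack(bytes[4..<8]))
    }
}
```

The sender would write header.encoded() before the first audio bytes; the receiver decodes it and builds its AVAudioFormat from those values instead of constants.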

func copyAudioBufferBytes(_ audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // initialize bytes to 0 (how to avoid?)
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)

    // copy data from buffer
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }

    return audioByteArray
}

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    // format assumption! make this part of your protocol?
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]
    // for stereo
    // let dstRight = audioBuffer.floatChannelData![1]

    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }

    return audioBuffer
}
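The two functions above are essentially a memcpy in each direction. As a sanity check, here is the same round trip on a plain [Float] array, a stand-in for the buffer's channel data so it runs without AVFoundation; the restored samples are bit-identical to the originals:

```swift
import Foundation

// Stand-in for audioBuffer.floatChannelData![0].
let samples: [Float] = (0..<1024).map { Float($0) * 0.25 }

// [Float] -> [UInt8], analogous to copyAudioBufferBytes.
let numBytes = samples.count * MemoryLayout<Float>.size
var audioByteArray = [UInt8](repeating: 0, count: numBytes)
samples.withUnsafeBufferPointer { src in
    audioByteArray.withUnsafeMutableBufferPointer { dst in
        UnsafeMutableRawPointer(dst.baseAddress!)
            .copyMemory(from: src.baseAddress!, byteCount: numBytes)
    }
}

// [UInt8] -> [Float], analogous to bytesToAudioBuffer.
var restored = [Float](repeating: 0, count: samples.count)
audioByteArray.withUnsafeBufferPointer { src in
    restored.withUnsafeMutableBufferPointer { dst in
        UnsafeMutableRawPointer(dst.baseAddress!)
            .copyMemory(from: src.baseAddress!, byteCount: numBytes)
    }
}

print(restored == samples)
```

Because the copy is a raw byte-for-byte transfer, no precision is lost; the equality check compares every sample exactly.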
