How to save the audio with changed pitch and speed on iOS?


Problem description

I'm able to change the pitch and speed of my audio, but I'm having trouble saving the audio with the changed pitch and speed.

// This method connects the time-pitch effect and starts playback.
// (audioPlayerNode and timePitchEffect are assumed to be created and
// attached to the engine earlier in the class.)
[self.audioEngine connect:audioPlayerNode
                       to:timePitchEffect
                   format:nil];
[self.audioEngine connect:timePitchEffect
                       to:self.audioEngine.outputNode
                   format:nil];

[audioPlayerNode scheduleFile:self.audioFile
                       atTime:nil
            completionHandler:nil];

NSError *audioEngineError = nil;
if (![self.audioEngine startAndReturnError:&audioEngineError]) {
    NSLog(@"Failed to start audio engine: %@", audioEngineError);
}
NSLog(@"%@", self.audioFile.url);

// Call the method on a button tap.

[self playAudioWithEffect:EAudioEffectPitch effectValue:@-500.0];

I'm changing the pitch with this method, but how do I save the audio with the changed pitch? Please help.

Answer

You can enable offline manual rendering mode in AVAudioEngine. In this mode, the engine's input and output nodes are disconnected from the audio hardware, and rendering is driven by your app.

Prepare the source audio

let sourceFile: AVAudioFile
let format: AVAudioFormat
do {
    let sourceFileURL = Bundle.main.url(forResource: "YOUR_AUDIO_NAME", withExtension: "caf")!
    sourceFile = try AVAudioFile(forReading: sourceFileURL)
    format = sourceFile.processingFormat
} catch {
    fatalError("Unable to load the source audio file: \(error.localizedDescription).")
}

Create and configure the audio engine

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let reverb = AVAudioUnitReverb()

engine.attach(player)
engine.attach(reverb)

// Set the desired reverb parameters.
reverb.loadFactoryPreset(.mediumHall)
reverb.wetDryMix = 50

// Connect the nodes.
engine.connect(player, to: reverb, format: format)
engine.connect(reverb, to: engine.mainMixerNode, format: format)

// Schedule the source file.
player.scheduleFile(sourceFile, at: nil)
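The sample above demonstrates the technique with a reverb effect. Since the question is about pitch and speed, the same graph can be built with an AVAudioUnitTimePitch node instead. A standalone sketch (the -500 cents value is taken from the question; a fixed stereo format stands in here for the `sourceFile.processingFormat` you would use in the full flow):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()

// Match the question: lower the pitch by 500 cents, keep the speed unchanged.
timePitch.pitch = -500   // measured in cents, range -2400...2400
timePitch.rate = 1.0     // playback rate, range 1/32...32

engine.attach(player)
engine.attach(timePitch)

// Player -> time-pitch effect -> main mixer.
let format = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 2)
engine.connect(player, to: timePitch, format: format)
engine.connect(timePitch, to: engine.mainMixerNode, format: format)
```

After wiring the graph this way, the remaining steps (scheduling the file, enabling manual rendering, and the render loop) are identical to the reverb example.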

Enable offline manual rendering mode

do {
    // The maximum number of frames the engine renders in any single render call.
    let maxFrames: AVAudioFrameCount = 4096
    try engine.enableManualRenderingMode(.offline, format: format,
                                         maximumFrameCount: maxFrames)
} catch {
    fatalError("Enabling manual rendering mode failed: \(error).")
}
// Start the engine and begin playback.
do {
    try engine.start()
    player.play()
} catch {
    fatalError("Unable to start audio engine: \(error).")
}

Prepare the output destination

// The output buffer to which the engine renders the processed data.
let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                              frameCapacity: engine.manualRenderingMaximumFrameCount)!

let outputFile: AVAudioFile
do {
    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let outputURL = documentsURL.appendingPathComponent("Rhythm-processed.caf")
    outputFile = try AVAudioFile(forWriting: outputURL, settings: sourceFile.fileFormat.settings)
} catch {
    fatalError("Unable to open output audio file: \(error).")
}

Manually render the audio

while engine.manualRenderingSampleTime < sourceFile.length {
    do {
        let frameCount = sourceFile.length - engine.manualRenderingSampleTime
        let framesToRender = min(AVAudioFrameCount(frameCount), buffer.frameCapacity)
        
        let status = try engine.renderOffline(framesToRender, to: buffer)
        
        switch status {
            
        case .success:
            // The data rendered successfully. Write it to the output file.
            try outputFile.write(from: buffer)
            
        case .insufficientDataFromInputNode:
            // Applicable only when using the input node as one of the sources.
            break
            
        case .cannotDoInCurrentContext:
            // The engine couldn't render in the current render call.
            // Retry in the next iteration.
            break
            
        case .error:
            // An error occurred while rendering the audio.
            fatalError("The manual rendering failed.")
        }
    } catch {
        fatalError("The manual rendering failed: \(error).")
    }
}

// Stop the player node and engine.
player.stop()
engine.stop()
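One caveat about the loop above: it renders exactly `sourceFile.length` frames, which is only correct when the effect preserves duration. If the time-pitch node's `rate` differs from 1.0, the processed audio is shorter or longer than the source, so the target frame count should be scaled by the rate. A sketch with illustrative numbers:

```swift
import AVFoundation

// A rate of 1.5 plays 1.5x faster, so the output is 1/1.5 as long.
let sourceLength: AVAudioFramePosition = 441_000   // e.g. 10 s at 44.1 kHz
let rate = 1.5
let targetFrames = AVAudioFramePosition((Double(sourceLength) / rate).rounded(.up))
// Use targetFrames in place of sourceFile.length in the
// while-loop condition whenever rate != 1.0.
```

With `rate = 1.0` (pitch-only changes, as in the question) the source length can be used directly.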
