AVAudioPCMBuffer for music files

Question

I've been trying to play music in my SpriteKit game and used the AVAudioPlayerNode class to do so via AVAudioPCMBuffers. Every time I exported my OS X project, it would crash and give me an error regarding audio playback. After banging my head against the wall for the last 24 hours, I decided to re-watch WWDC session 501 (see 54:17). My solution to this problem was the one the presenter used, which is to break the audio file being read into smaller chunks of frames rather than reading it into one buffer outright.

NSError *error = nil;
NSURL *someFileURL = ...
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: someFileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;

AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
while (audioFile.framePosition < fileLength) {
    AVAudioFramePosition readPosition = audioFile.framePosition;
    if (![audioFile readIntoBuffer: readBuffer error: &error])
        return NO;
    if (readBuffer.frameLength == 0) //end of file reached
        break;
}

My current problem is that the player only plays the last frame read into the buffer. The music that I'm playing is only 2 minutes long. Apparently, this is too long to just read into the buffer outright. Is the buffer being overwritten every time the readIntoBuffer: method is called inside the loop? I'm such a noob at this stuff...how can I get the entire file played?
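
One note on that: readIntoBuffer: starts writing at the beginning of the buffer on every call, so reusing a single readBuffer in the loop keeps only the last chunk read. A minimal sketch of the chunked approach (assuming the same someFileURL as above and an AVAudioPlayerNode player that is already attached to, and connected within, a running AVAudioEngine) would give each chunk its own buffer and schedule it as it is read:

//Sketch: read the file in chunks, giving each chunk its own buffer,
//and schedule every chunk on the player so they play back to back.
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:someFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024;

while (audioFile.framePosition < audioFile.length) {
    AVAudioPCMBuffer *chunk = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
    if (![audioFile readIntoBuffer:chunk error:&error] || chunk.frameLength == 0) {
        break; //read failed or end of file reached
    }
    [player scheduleBuffer:chunk completionHandler:nil]; //queued in order behind any previously scheduled chunks
}
[player play];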

If I can't get this to work, what is a good way to play music (2 different files) across multiple SKScenes?

Answer

This is the solution that I came up with. It's still not perfect, but hopefully it will help someone who is in the same predicament that I've found myself in. I created a singleton class to handle this job. One improvement that can be made in the future is to only load sound effects and music files needed for a particular SKScene at the time they are needed. I had so many issues with this code that I don't want to mess with it now. Currently, I don't have too many sounds, so it's not using an excessive amount of memory.

Overview
My strategy was the following:

  1. Store the audio file names for the game in a plist
  2. Read from that plist and create two dictionaries (one for music and one for short sound effects)
  3. The sound effect dictionary is composed of an AVAudioPCMBuffer and an AVAudioPlayerNode for each of the sounds
  4. The music dictionary is composed of an array of AVAudioPCMBuffers, an array of timestamps for when those buffers should be played in the queue, an AVAudioPlayerNode, and the sample rate of the original audio file

  • The sample rate is essential for determining the time at which each buffer should be played (you will see the calculation in the code; a small sketch follows below)
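
To make that concrete, here is a small illustrative calculation (the numbers are made up; in the real code below, framePosition is recorded from audioFile.framePosition before each chunk is read and sampleRate comes from audioFile.fileFormat.sampleRate):

//Sketch: turning a chunk's starting frame into the time at which it should be scheduled.
double sampleRate = 44100.0;                          //Hz (illustrative)
AVAudioFramePosition framePosition = 1024 * 1024;     //starting frame of the second chunk (illustrative)
AVAudioTime *startTime = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];
double startInSeconds = framePosition / sampleRate;   //about 23.8 seconds into the track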

Create an AVAudioEngine, get the main mixer from the engine, attach all AVAudioPlayerNodes to the engine, and connect them to the mixer (as per usual)

  • Sound effect playing is straightforward... call the method -(void) playSfxFile:(NSString*)file; and it plays a sound
  • For music, I just couldn't find a good solution without invoking the help of the scene trying to play the music. The scene will call -(void) playMusicFile:(NSString*)file; and it will schedule the buffers to play in the order that they were created. I couldn't find a good way to get the music to repeat once it completed within my AudioEngine class, so I decided to have the scene check in its update: method whether or not the music was playing for a particular file and, if not, play it again (not a very slick solution, but it works; see the sketch below)
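
A rough sketch of that scene-side check (hypothetical SKScene subclass; "menuscenemusic" is one of the file keys used in the code below):

//In an SKScene subclass: start the music when the scene appears and
//restart it from update: once it has finished (the polling approach described above).
-(void) didMoveToView:(SKView *)view {
    [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];
}

-(void) update:(NSTimeInterval)currentTime {
    if (![[AudioEngine sharedData] isPlayingMusic:@"menuscenemusic"]) {
        [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];
    }
}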

AudioEngine.h

#import <Foundation/Foundation.h>

@interface AudioEngine : NSObject

+(instancetype)sharedData;
-(void) playSfxFile:(NSString*)file;
-(void) playMusicFile:(NSString*)file;
-(void) pauseMusic:(NSString*)file;
-(void) unpauseMusic:(NSString*)file;
-(void) stopMusicFile:(NSString*)file;
-(void) setVolumePercentages;
-(bool) isPlayingMusic:(NSString*)file;

@end

AudioEngine.m

#import "AudioEngine.h"
#import <AVFoundation/AVFoundation.h>
#import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount)

@interface AudioEngine()

@property AVAudioEngine *engine;
@property AVAudioMixerNode *mixer;

@property NSMutableDictionary *musicDict;
@property NSMutableDictionary *sfxDict;

@property NSString *audioInfoPList;

@property float musicVolumePercent;
@property float sfxVolumePercent;
@property float fadeVolume;
@property float timerCount;

@end

@implementation AudioEngine

int const FADE_ITERATIONS = 10;
static NSString * const MUSIC_PLAYER = @"player";
static NSString * const MUSIC_BUFFERS = @"buffers";
static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions";
static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate";

static NSString * const SFX_BUFFER = @"buffer";
static NSString * const SFX_PLAYER = @"player";

+(instancetype) sharedData {
    static AudioEngine *sharedInstance = nil;

    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        sharedInstance = [[self alloc] init];
        [sharedInstance startEngine];
    });

    return sharedInstance;
}

-(instancetype) init {
    if (self = [super init]) {
        _engine = [[AVAudioEngine alloc] init];
        _mixer = [_engine mainMixerNode];

        _audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist

        [self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played
        [self initMusic];
        [self initSfx];
    }
    return self;
}

//opens all music files, creates multiple buffers depending on the length of the file and a player
-(void) initMusic {
    _musicDict = [NSMutableDictionary dictionary];

    _audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"];
    NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];

    for (NSString *musicFileName in audioInfoData[@"music"]) {
        [self loadMusicIntoBuffer:musicFileName];
        AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
        [_engine attachNode:player];

        AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0];
        [_engine connect:player to:_mixer format:buffer.format];
        [_musicDict[musicFileName] setObject:player forKey:MUSIC_PLAYER];
    }
}

//opens a music file and creates an array of buffers
-(void) loadMusicIntoBuffer:(NSString *)filename
{
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"];
    //NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]];
    NSAssert(audioFileURL, @"Error creating URL to audio file");
    NSError *error = nil;
    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
    NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);

    AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file
    float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file
    [_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename];
    [_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE];

    NSMutableArray *buffers = [NSMutableArray array];
    NSMutableArray *framePositions = [NSMutableArray array];

    const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //the size of my buffer...can be made bigger or smaller 512 * 1024L would be half the size
    while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer
        [framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]];
        AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
        if (![audioFile readIntoBuffer:readBuffer error:&error]) {
            NSLog(@"failed to read audio file: %@", error);
            return;
        }
        if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop
            break;
        }
        [buffers addObject:readBuffer];
    }

    [_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS];
    [_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS];
}

-(void) initSfx {
    _sfxDict = [NSMutableDictionary dictionary];

    NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];

    for (NSString *sfxFileName in audioInfoData[@"sfx"]) {
        AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
        [_engine attachNode:player];

        [self loadSoundIntoBuffer:sfxFileName];
        AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER];
        [_engine connect:player to:_mixer format:buffer.format];
        [_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER];
    }
}

//WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space
-(void) loadSoundIntoBuffer:(NSString *)filename
{
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"mp3"];
    NSAssert(audioFileURL, @"Error creating URL to audio file");
    NSError *error = nil;
    AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
    NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);

    AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
    [audioFile readIntoBuffer:readBuffer error:&error];

    [_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename];
    [_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER];
}

-(void)startEngine {
    [_engine startAndReturnError:nil];
}

-(void) playSfxFile:(NSString*)file {
    AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:SFX_PLAYER];
    AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER];
    [player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil];
    [player setVolume:_sfxVolumePercent];
    [player play];
}

-(void) playMusicFile:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];

    if ([player isPlaying] == NO) {
        NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS];

        double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue];

        for (int i = 0; i < [buffers count]; i++) {
            long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue];
            AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];

            AVAudioPCMBuffer *buffer = [buffers objectAtIndex:i];
            [player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
                if (i == [buffers count] - 1) { //stop the player after the last buffer has played
                    [player stop];
                }
            }];
        }
        [player setVolume:_musicVolumePercent];
        [player play];
    }
}

-(void) stopOtherMusicPlayersNotNamed:(NSString*)file {
    if ([file isEqualToString:@"menuscenemusic"]) {
        AVAudioPlayerNode *player = [_musicDict[@"levelscenemusic"] objectForKey:MUSIC_PLAYER];
        [player stop];
    }
    else {
        AVAudioPlayerNode *player = [_musicDict[@"menuscenemusic"] objectForKey:MUSIC_PLAYER];
        [player stop];
    }
}

//stops the player for a particular sound
-(void) stopMusicFile:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];

    if ([player isPlaying]) {
        _timerCount = FADE_ITERATIONS;
        _fadeVolume = _musicVolumePercent;
        [self fadeOutMusicForPlayer:player]; //fade out the music
    }
}

//helper method for stopMusicFile:
-(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player {
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES];
}

//helper method for stopMusicFile:
-(void) handleTimer:(NSTimer*)timer {
    AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo;
    if (_timerCount > 0) {
        _timerCount--;
        _fadeVolume = _musicVolumePercent * (_timerCount / FADE_ITERATIONS);
        [player setVolume:_fadeVolume];
    }
    else {
        [player stop];
        [player setVolume:_musicVolumePercent];
        [timer invalidate];
    }
}

-(void) pauseMusic:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    if ([player isPlaying]) {
        [player pause];
    }
}

-(void) unpauseMusic:(NSString*)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    [player play];
}

//sets the volume of the player based on user preferences in GameData class
-(void) setVolumePercentages {
    NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"];
    _musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet:
                             [[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
                             componentsJoinedByString:@""] floatValue] / 100;
    NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"];
    _sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet:
                           [[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
                           componentsJoinedByString:@""] floatValue] / 100;

    //immediately sets music to new volume
    for (NSString *file in [_musicDict allKeys]) {
        AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
        [player setVolume:_musicVolumePercent];
    }
}

-(bool) isPlayingMusic:(NSString *)file {
    AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
    return [player isPlaying];
}

@end
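
For completeness, a sketch of how the rest of the game might call into the singleton. The file names are placeholders; they have to match the entries listed under the "music" and "sfx" arrays in AudioInfo.plist (without extensions, since the loaders above append .aif for music and .mp3 for effects):

//Example call sites (sketch). "explosion" is a hypothetical sound effect name;
//"menuscenemusic" is one of the music keys used above.
[[AudioEngine sharedData] playSfxFile:@"explosion"];        //one-shot sound effect
[[AudioEngine sharedData] playMusicFile:@"menuscenemusic"]; //schedules all of the file's buffers in order
[[AudioEngine sharedData] pauseMusic:@"menuscenemusic"];
[[AudioEngine sharedData] unpauseMusic:@"menuscenemusic"];
[[AudioEngine sharedData] stopMusicFile:@"menuscenemusic"]; //fades out over roughly one second (10 ticks of a 0.1 s timer)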
