Record and play audio with AVAssetWriter


Question


I have reduced this question quite a bit and am hoping for some help.

Basically this class has two methods, one to start recording audio (-recordMode) and the other to play audio (-playMode). I currently have this class in a project with a single view controller and two buttons that call the corresponding methods (rec, play). There are no other variables; the class is self-contained.

However, it will not play or record anything, and I cannot figure out why. When I try to play the file I get a file size of 0 and an error, because of course you can't init an AVAudioPlayer with a nil reference. But I don't understand why the file is empty, or why self.outputPath is nil.

.h file:

#import <AVFoundation/AVFoundation.h>

@interface MicCommunicator : NSObject<AVCaptureAudioDataOutputSampleBufferDelegate>

@property(nonatomic,retain) NSURL *outputPath;
@property(nonatomic,retain) AVCaptureSession * captureSession;
@property(nonatomic,retain) AVCaptureAudioDataOutput * output;

-(void)beginStreaming;
-(void)playMode;
-(void)recordMode;

@end

.m file:

@implementation MicCommunicator {
    AVAssetWriter *assetWriter;
    AVAssetWriterInput *assetWriterInput;
}

@synthesize captureSession = _captureSession;
@synthesize output = _output;
@synthesize outputPath = _outputPath;

-(id)init {
    if ((self = [super init])) {
        NSArray *searchPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        self.outputPath = [NSURL fileURLWithPath:[[searchPaths objectAtIndex:0] stringByAppendingPathComponent:@"micOutput.output"]];

        AudioChannelLayout acl;
        bzero(&acl, sizeof(acl));
        acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono; //kAudioChannelLayoutTag_Stereo;
        NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                             [NSNumber numberWithInt: kAudioFormatULaw],AVFormatIDKey,        
                                             [NSNumber numberWithFloat:8000.0],AVSampleRateKey,//was 44100.0
                                             [NSData dataWithBytes: &acl length: sizeof( AudioChannelLayout ) ], AVChannelLayoutKey,
                                             [NSNumber numberWithInt:1],AVNumberOfChannelsKey,
                                             [NSNumber numberWithInt:8000.0],AVEncoderBitRateKey,
                                             nil];

        assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioOutputSettings];
        [assetWriterInput setExpectsMediaDataInRealTime:YES];

        assetWriter = [[AVAssetWriter assetWriterWithURL:_outputPath fileType:AVFileTypeWAVE error:nil] retain];
        [assetWriter addInput:assetWriterInput];
    }
    return self;
}

-(void)dealloc {
    [assetWriter release];
    [super dealloc];
}

//conveniance methods

-(void)playMode
{
    [self stopRecording];

    NSError *error;
    AVAudioPlayer * audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:self.outputPath error:&error];
    audioPlayer.numberOfLoops = -1;

    if (audioPlayer == nil){
        NSLog(@"error: %@",[error description]);        
    }else{ 
        NSLog(@"playing");  
        [audioPlayer play];
    }
}

-(void)recordMode
{
        [self beginStreaming];    
}

-(void)stopRecording
{
    [self.captureSession stopRunning];
    [assetWriterInput markAsFinished];
    [assetWriter  finishWriting];

    NSDictionary *outputFileAttributes = [[NSFileManager defaultManager] attributesOfItemAtPath:[NSString stringWithFormat:@"%@",self.outputPath] error:nil];
    NSLog (@"done. file size is %llu", [outputFileAttributes fileSize]);
}

//starts audio recording
-(void)beginStreaming {
    self.captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *error = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
    if (audioInput)
        [self.captureSession addInput:audioInput];
    else {
        NSLog(@"No audio input found.");
        return;
    }

    AVCaptureAudioDataOutput *output = [[AVCaptureAudioDataOutput alloc] init];

    dispatch_queue_t outputQueue = dispatch_queue_create("micOutputDispatchQueue", NULL);
    [output setSampleBufferDelegate:self queue:outputQueue];
    dispatch_release(outputQueue);

    [self.captureSession addOutput:output];
    [assetWriter startWriting];
    [self.captureSession startRunning];
}

//callback
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    AudioBufferList audioBufferList;
    NSMutableData *data= [[NSMutableData alloc] init];
    CMBlockBufferRef blockBuffer;
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, NULL, &audioBufferList, sizeof(audioBufferList), NULL, NULL, 0, &blockBuffer);

    //for (int y = 0; y < audioBufferList.mNumberBuffers; y++) {
    //  AudioBuffer audioBuffer = audioBufferList.mBuffers[y];
    //  Float32 *frame = (Float32*)audioBuffer.mData;
    //          
    //  [data appendBytes:frame length:audioBuffer.mDataByteSize];
    //}

    // append [data bytes] to your NSOutputStream 


    // These two lines write to disk, you may not need this, just providing an example
    [assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    [assetWriterInput appendSampleBuffer:sampleBuffer];

    CFRelease(blockBuffer);
    blockBuffer=NULL;
    [data release];
}

@end

Solution

Per Apple Support:

This is therefore the bug: the file is created, a number of samples are written successfully, and then appending starts failing for some unknown reason.

It seems that AVAssetWriter fails only with these settings.
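
As an aside, one workaround (not from the Apple response above, just a common alternative) is to have AVAssetWriter write uncompressed audio instead, since Linear PCM output settings work with AVFileTypeWAVE. A minimal sketch of a replacement settings dictionary, keeping the question's mono, 8 kHz capture:

AudioChannelLayout acl;
bzero(&acl, sizeof(acl));
acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;

// Uncompressed 16-bit integer PCM; AVAssetWriter accepts this for WAVE output.
// Note: no AVEncoderBitRateKey here -- it does not apply to PCM.
NSDictionary *audioOutputSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                     [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                                     [NSNumber numberWithFloat:8000.0], AVSampleRateKey,
                                     [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                                     [NSData dataWithBytes:&acl length:sizeof(acl)], AVChannelLayoutKey,
                                     [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                                     [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                                     [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                                     [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                                     nil];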

AudioQueue is what should be used for µLaw audio.
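
For completeness, here is a rough sketch of what the Audio Queue route could look like for µLaw capture. This is illustrative only, not code from the Apple response; RecorderState, HandleInputBuffer, and StartULawRecording are names invented for the example:

#import <AudioToolbox/AudioToolbox.h>

typedef struct {
    AudioFileID file;   // destination audio file
    SInt64 packetIndex; // running packet count for AudioFileWritePackets
} RecorderState;

// Input callback: the queue hands us a filled buffer of µLaw bytes;
// write them to the file, then hand the buffer back to the queue.
static void HandleInputBuffer(void *inUserData, AudioQueueRef inAQ,
                              AudioQueueBufferRef inBuffer,
                              const AudioTimeStamp *inStartTime,
                              UInt32 inNumPackets,
                              const AudioStreamPacketDescription *inPacketDesc) {
    RecorderState *state = (RecorderState *)inUserData;
    // µLaw is CBR (1 byte per packet), so the packet count may arrive as 0.
    UInt32 numPackets = inNumPackets ? inNumPackets : inBuffer->mAudioDataByteSize;
    if (numPackets > 0 &&
        AudioFileWritePackets(state->file, false, inBuffer->mAudioDataByteSize,
                              inPacketDesc, state->packetIndex, &numPackets,
                              inBuffer->mAudioData) == noErr) {
        state->packetIndex += numPackets;
    }
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}

static RecorderState state;

AudioQueueRef StartULawRecording(NSURL *outputURL) {
    // µLaw: 8 kHz, mono, 8 bits per encoded sample, 1 byte per packet.
    AudioStreamBasicDescription fmt = {0};
    fmt.mSampleRate       = 8000.0;
    fmt.mFormatID         = kAudioFormatULaw;
    fmt.mChannelsPerFrame = 1;
    fmt.mBitsPerChannel   = 8;
    fmt.mFramesPerPacket  = 1;
    fmt.mBytesPerFrame    = 1;
    fmt.mBytesPerPacket   = 1;

    AudioFileCreateWithURL((CFURLRef)outputURL, kAudioFileWAVEType, &fmt,
                           kAudioFileFlags_EraseFile, &state.file);

    AudioQueueRef queue = NULL;
    AudioQueueNewInput(&fmt, HandleInputBuffer, &state, NULL, NULL, 0, &queue);

    for (int i = 0; i < 3; i++) {          // prime the queue with empty buffers
        AudioQueueBufferRef buf;
        AudioQueueAllocateBuffer(queue, 4096, &buf);
        AudioQueueEnqueueBuffer(queue, buf, 0, NULL);
    }
    AudioQueueStart(queue, NULL);
    return queue; // later: AudioQueueStop(queue, true); AudioFileClose(state.file);
}

Stopping is symmetric: AudioQueueStop followed by AudioFileClose, which finalizes the file header so the result is playable.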
