iOS 8 iPad AVCaptureMovieFileOutput drops / loses / never gets audio track after 13 - 14 seconds of recording


Question

I have the following code, which works on iOS 6 and 7.x.

In iOS 8.1 I have a strange issue: if you capture a session for about 13 seconds or longer, the resulting AVAsset has only one track (video); the audio track is simply not there.

If you record for a shorter period, the AVAsset has two tracks (video and audio) as expected. I have plenty of disk space, and the app has permission to use the camera and microphone.

I created a new project with minimal code, and it reproduces the issue.

Any ideas would be greatly appreciated.

#import "ViewController.h"

@interface ViewController ()

@end

@implementation ViewController
{
    enum RecordingState { Recording, Stopped };
    enum RecordingState recordingState;

    AVCaptureSession *session;
    AVCaptureMovieFileOutput *output;
    AVPlayer *player;
    AVPlayerLayer *playerLayer;
    bool audioGranted;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    [self setupAV];
    recordingState = Stopped;
}

-(void)setupAV
{
    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    AVCaptureDevice *videoDevice = nil;

    for ( AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo] ) {
        if ( device.position == AVCaptureDevicePositionBack ) {
            videoDevice = device;
            break;
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (videoDevice && audioDevice)
    {
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
        [session addInput:input];

        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:nil];
        [session addInput:audioInput];

        NSURL *recordURL = [self tempUrlForRecording];
        [[NSFileManager defaultManager] removeItemAtURL:recordURL error:nil];

        output = [[AVCaptureMovieFileOutput alloc] init];
        output.maxRecordedDuration = CMTimeMake(45, 1);   // cap recordings at 45 seconds
        output.maxRecordedFileSize = 1028 * 1028 * 1000;  // ~1 GB file-size cap
        [session addOutput:output];
    }
    [session commitConfiguration];
}

- (IBAction)recordingButtonClicked:(id)sender {
    if(recordingState == Stopped)
    {
        [self startRecording];
    }
    else
    {
        [self stopRecording];
    }
}

-(void)startRecording
{
    recordingState = Recording;
    [session startRunning];
    [output startRecordingToOutputFileURL:[self tempUrlForRecording] recordingDelegate:self];

}

-(void)stopRecording
{
    recordingState = Stopped;
    [output stopRecording];
    [session stopRunning];
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    AVAsset *cameraInput = [AVAsset assetWithURL:[self tempUrlForRecording]];
    //DEPENDING ON HOW LONG RECORDED THIS DIFFERS (<14 SECS - 2 Tracks, >14 SECS - 1 Track)
    NSLog(@"Number of tracks: %i", cameraInput.tracks.count);
}

-(NSURL *)tempUrlForRecording
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    NSString *path = @"camerabuffer.mp4";
    NSString *pathCameraInput =[documentsDirectoryPath stringByAppendingPathComponent: path];
    NSURL *urlCameraInput = [NSURL fileURLWithPath:pathCameraInput];

    return urlCameraInput;
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

@end
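
For what it's worth, the track count logged in the delegate callback above can also be broken down by media type; a minimal sketch of what could go inside captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:, using its outputFileURL parameter:

    AVAsset *asset = [AVAsset assetWithURL:outputFileURL];
    NSUInteger videoTrackCount = [asset tracksWithMediaType:AVMediaTypeVideo].count;
    NSUInteger audioTrackCount = [asset tracksWithMediaType:AVMediaTypeAudio].count;
    // Shorter recordings report 1 video track and 1 audio track; after
    // ~13 - 14 seconds of recording on iOS 8.1 the audio count comes back 0.
    NSLog(@"Video tracks: %lu, audio tracks: %lu",
          (unsigned long)videoTrackCount, (unsigned long)audioTrackCount);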


Answer

This will help you to fix it:

[movieOutput setMovieFragmentInterval:kCMTimeInvalid];

I think this is a bug. The documentation says the sample table is not written if the recording does not complete successfully, so it should be written automatically when the recording does complete successfully. But right now it seems like it isn't.
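
Applied to the setupAV method from the question, the output configuration would look roughly like this (a sketch reusing the question's output ivar and session; the movieFragmentInterval line is the only change):

        output = [[AVCaptureMovieFileOutput alloc] init];
        output.maxRecordedDuration = CMTimeMake(45, 1);
        output.maxRecordedFileSize = 1028 * 1028 * 1000;

        // AVCaptureMovieFileOutput writes a movie fragment every 10 seconds by
        // default. kCMTimeInvalid disables fragment writing, so the movie header
        // is written once when recording finishes, which appears to avoid the
        // missing audio track on recordings longer than ~13 seconds.
        output.movieFragmentInterval = kCMTimeInvalid;

        [session addOutput:output];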

Any thoughts?
