How to save a movie from AVCapture


Problem description

I've been trying to figure out AVCapture for the last couple of days and am struggling to save a video. My understanding is that you call [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self]; and then at a later time call [movieFileOutput stopRecording];, which should then trigger the delegate method -(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error. After that I should be able to save the movie with something like UISaveVideoAtPathToSavedPhotosAlbum([outputFileURL path], nil, nil, nil);. But apparently I'm not doing it correctly: when I start the session and then call startRecordingToOutputFileURL, it immediately calls the delegate's didFinishRecording. I can't figure out why. Here is my code:

-(void)viewDidAppear:(BOOL)animated{
    [super viewDidAppear:animated];
    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.frame = self.imagePreview.bounds; //UIView *imagePreview
    [self.imagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [self getCamera];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
    }

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    NSString *archives = [documentsDirectoryPath stringByAppendingPathComponent:@"archives"];
    NSString *outputpathofmovie = [[archives stringByAppendingPathComponent:@"Test"] stringByAppendingString:@".mp4"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputpathofmovie];

    [session addInput:input];
    [session addOutput:movieFileOutput];
    [session commitConfiguration];
    [session startRunning];
    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
    // Schedule a one-shot timer to stop recording after 7 seconds.
    [NSTimer scheduledTimerWithTimeInterval:7 target:self selector:@selector(stopRun) userInfo:nil repeats:NO];
    /*
    [self initializeCamera];
     */
}
-(void)stopRun{
    [movieFileOutput stopRecording];
}

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
    NSLog(@"capture done url: %@",outputFileURL);
    UISaveVideoAtPathToSavedPhotosAlbum([outputFileURL path], nil, nil, nil);
}

-(AVCaptureDevice*)getCamera{
    NSArray *devices = [AVCaptureDevice devices];
    AVCaptureDevice *frontCamera;
    AVCaptureDevice *backCamera;
    for (AVCaptureDevice *device in devices) {
        NSLog(@"Device name: %@", [device localizedName]);
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device position] == AVCaptureDevicePositionBack) {
                NSLog(@"Device position : back");
                backCamera = device;
            }
            else {
                NSLog(@"Device position : front");
                frontCamera = device;
            }
        }
    }
    return frontCamera;
}

Sorry that it's so lengthy. I hope a lot of this code can be useful to someone else.

Recommended answer

Disclaimer: I am not an Objective-C programmer, and it's only been 15 days since I first started reading about the language itself.

I had to do something similar. Here is the code that worked for me. I grabbed it from different Stack Overflow questions and from examples on developer.apple.com. I have commented out the code that I didn't need for my working prototype. You can play around with it.

-(void) saveRecordingLocally
{
    dispatch_async([self sessionQueue], ^{
        if (![[self movieFileOutput] isRecording])
        {
            if ([[UIDevice currentDevice] isMultitaskingSupported])
            {
                // Set up a background task. This is needed because the
                // captureOutput:didFinishRecordingToOutputFileAtURL: callback is not received
                // until AVCam returns to the foreground unless you request background execution
                // time. This also ensures that there will be time to write the file to the
                // assets library when AVCam is backgrounded. To conclude this background
                // execution, endBackgroundTask is called in the delegate below after the
                // recorded file has been saved.
                [self setBackgroundRecordingID:[[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil]];
            }

            // Update the orientation on the movie file output video connection before starting recording.
//          [[[self movieFileOutput] connectionWithMediaType:AVMediaTypeVideo] setVideoOrientation:[[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] videoOrientation]];

            // Turning OFF flash for video recording
            //[AVCamViewController setFlashMode:AVCaptureFlashModeOff forDevice:[[self videoDeviceInput] device]];

            // Start recording to a temporary file.
            NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:[@"movie" stringByAppendingPathExtension:@"mov"]];
            [[self movieFileOutput] startRecordingToOutputFileURL:[NSURL fileURLWithPath:outputFilePath] recordingDelegate:self];
        }
    });
}

// Delegate callback: called when the movie file output has finished writing the recorded file.
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    NSLog(@"OutputFileUrl %@", outputFileURL);
    if(error){
        NSLog(@"ERROR : %@", error);
    }
    UIBackgroundTaskIdentifier backgroundRecordingID = [self backgroundRecordingID];
    [self setBackgroundRecordingID:UIBackgroundTaskInvalid];
    [[[ALAssetsLibrary alloc] init] writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error)
            NSLog(@"%@", error);

        [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:nil];

        if (backgroundRecordingID != UIBackgroundTaskInvalid)
            [[UIApplication sharedApplication] endBackgroundTask:backgroundRecordingID];
    }];
}
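The recording has to be stopped at some point for the delegate above to fire. Below is a minimal sketch of the stop side, assuming the same sessionQueue and movieFileOutput properties used in saveRecordingLocally (the method name stopRecordingLocally is only illustrative):

-(void) stopRecordingLocally
{
    dispatch_async([self sessionQueue], ^{
        if ([[self movieFileOutput] isRecording])
        {
            // Finishes writing the movie file; AVCaptureMovieFileOutput then calls
            // captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:
            // on the recording delegate that was passed when recording started.
            [[self movieFileOutput] stopRecording];
        }
    });
}

Calling this from a button handler or after a fixed delay is enough; once the file has been finalized, the delegate above runs and writes the video to the saved photos album.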
