Pause & resume video capture using AVCaptureMovieFileOutput and AVCaptureVideoDataOutput in iOS

Problem description

I have to implement functionality to repeatedly pause and resume video capture in a single session, but have each new segment (the segments captured after each pause) added to the same video file, with AVFoundation. Currently, every time I press "stop" and then "record" again, it just saves a new video file to my iPhone's Documents directory and starts capturing to a new file. I need to be able to press the "record/stop" button repeatedly, capture video & audio only while recording is active, and then, when the "done" button is pressed, have a single AV file with all the segments together. And all of this needs to happen in the same capture session / preview session.

I am not using AVAssetWriterInput.

The only way I can think of to try this is, when the "done" button is pressed, to take each individual output file and combine them into a single file (the mergeMultipleVideo method below implements this approach).

This code works on iOS 5 but not on iOS 6. On iOS 6, the first time I pause recording (stop recording), the AVCaptureFileOutputRecordingDelegate method (captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error:) is called. But when I start recording again, the delegate method fires again immediately, and it is then not called when recording stops.

I need a solution for that issue. Please help me.

//View LifeCycle
- (void)viewDidLoad
{
[super viewDidLoad];

self.finalRecordedVideoName = [self stringWithNewUUID];

arrVideoName = [[NSMutableArray alloc]initWithCapacity:0];
arrOutputUrl = [[NSMutableArray alloc] initWithCapacity:0];

CaptureSession = [[AVCaptureSession alloc] init];


captureDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
if ([captureDevices count] > 0)
{
    NSError *error;
    VideoInputDevice = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:&error];
    if (VideoInputDevice)   //check the returned input rather than the NSError, which is only meaningful on failure
    {
        if ([CaptureSession canAddInput:VideoInputDevice])
            [CaptureSession addInput:VideoInputDevice];
        else
            NSLog(@"Couldn't add video input");
    }
    else
    {
        NSLog(@"Couldn't create video input");
    }
}
else
{
    NSLog(@"Couldn't create video capture device");
}



//ADD VIDEO PREVIEW LAYER
NSLog(@"Adding video preview layer");
AVCaptureVideoPreviewLayer *layer  = [[AVCaptureVideoPreviewLayer alloc] initWithSession:CaptureSession];

[self setPreviewLayer:layer];


UIDeviceOrientation currentOrientation = [UIDevice currentDevice].orientation;

NSLog(@"%d",currentOrientation);

if (currentOrientation == UIDeviceOrientationPortrait)
{
    PreviewLayer.orientation = AVCaptureVideoOrientationPortrait;
}
else if (currentOrientation == UIDeviceOrientationPortraitUpsideDown)
{
    PreviewLayer.orientation = AVCaptureVideoOrientationPortraitUpsideDown;
}
else if (currentOrientation == UIDeviceOrientationLandscapeRight)
{
    PreviewLayer.orientation = AVCaptureVideoOrientationLandscapeRight;
}
else if (currentOrientation == UIDeviceOrientationLandscapeLeft)
{
    PreviewLayer.orientation = AVCaptureVideoOrientationLandscapeLeft;
}

[[self PreviewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

//ADD MOVIE FILE OUTPUT
NSLog(@"Adding movie file output");
MovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
VideoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
[VideoDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;    //NOTE: was kCVPixelBufferBytesPerRowAlignmentKey, which is the wrong key for a pixel format value
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];

[VideoDataOutput setVideoSettings:videoSettings];

Float64 TotalSeconds = 60;          //Total seconds
int32_t preferredTimeScale = 30;    //Frames per second
CMTime maxDuration = CMTimeMakeWithSeconds(TotalSeconds, preferredTimeScale);//<<SET MAX DURATION
MovieFileOutput.maxRecordedDuration = maxDuration;
MovieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024; //<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME

//SET THE CONNECTION PROPERTIES (output properties)
[self CameraSetOutputProperties];           //(We call a method as it also has to be done after changing camera)
AVCaptureConnection *videoConnection = nil;

for ( AVCaptureConnection *connection in [MovieFileOutput connections] )
{
    NSLog(@"%@", connection);
    for ( AVCaptureInputPort *port in [connection inputPorts] )
    {
        NSLog(@"%@", port);
        if ( [[port mediaType] isEqual:AVMediaTypeVideo] )
        {
            videoConnection = connection;
        }
    }
}

if ([videoConnection isVideoOrientationSupported]) // **Here it is, it's always false**
{
    //NOTE: UIDeviceOrientation and AVCaptureVideoOrientation are distinct enums
    //(their two landscape values are swapped), so this direct cast is only a stopgap
    [videoConnection setVideoOrientation:(AVCaptureVideoOrientation)[[UIDevice currentDevice] orientation]];
}

NSLog(@"Setting image quality");
[CaptureSession setSessionPreset:AVCaptureSessionPresetLow];

//----- DISPLAY THE PREVIEW LAYER -----

CGRect layerRect = CGRectMake(5, 5, 299, ([[UIScreen mainScreen] bounds].size.height == 568)?438:348);

[self.PreviewLayer setBounds:layerRect];
[self.PreviewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),CGRectGetMidY(layerRect))];

if ([CaptureSession canAddOutput:MovieFileOutput])
    [CaptureSession addOutput:MovieFileOutput];
if ([CaptureSession canAddOutput:VideoDataOutput])      //the original added this output unconditionally; the if above only guarded MovieFileOutput
    [CaptureSession addOutput:VideoDataOutput];
//We use this instead so it goes on a layer behind our UI controls (avoids us having to manually bring each control to the front):
CameraView = [[UIView alloc] init];
[videoPreviewLayer addSubview:CameraView];      //videoPreviewLayer appears to be a container UIView from the nib, despite its name
[videoPreviewLayer sendSubviewToBack:CameraView];
[[CameraView layer] addSublayer:PreviewLayer];

//----- START THE CAPTURE SESSION RUNNING -----
[CaptureSession startRunning];
}

#pragma mark - IBAction Methods
-(IBAction)btnStartAndStopPressed:(id)sender
{
UIButton *StartAndStopButton = (UIButton*)sender;
if ([StartAndStopButton isSelected] == NO)
{
    [StartAndStopButton setSelected:YES];
    [btnPauseAndResume setEnabled:YES];
    [btnBack setEnabled:NO];
    [btnSwitchCameraInput setHidden:YES];

    NSDate *date = [NSDate date];
    NSLog(@" date %@",date);

    NSArray *paths                  =   NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *recordedFileName = nil;
    recordedFileName = [NSString stringWithFormat:@"output%@.mov",date];
    NSString *documentsDirectory    =   [paths objectAtIndex:0];
    self.outputPath                 =   [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@",recordedFileName]];
    NSLog(@"%@",self.outputPath);

    [arrVideoName addObject:recordedFileName];

    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:self.outputPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:self.outputPath])
    {
        NSError *error;
        if ([[NSFileManager defaultManager] removeItemAtPath:self.outputPath error:&error] == NO)
        {
            //Error - handle if required
        }
    }
    //Start recording
    WeAreRecording = YES;   //the original never set this on start, so the sample-buffer path stayed inactive until the first resume
    [MovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
    recordingTimer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(VideoRecording) userInfo:nil repeats:YES];

}
else
{
    [StartAndStopButton setSelected:NO];
    [btnPauseAndResume setEnabled:NO];
    [btnBack setEnabled:YES];
    [btnSwitchCameraInput setHidden:NO];

    NSLog(@"STOP RECORDING");
    WeAreRecording = NO;

    [MovieFileOutput stopRecording];
    [((ActOutAppDelegate *)ActOut_AppDelegate) showLoadingViewOnView:self.view withLabel:@"Please wait...."];

    if ([recordingTimer isValid])
    {
        [recordingTimer invalidate];
        recordingTimer = nil;
        recordingTime = 30;
    }

    stopRecording = YES;
}
}

- (IBAction)btnPauseAndResumePressed:(id)sender
{
UIButton *PauseAndResumeButton = (UIButton*)sender;
if (PauseAndResumeButton.selected == NO)
{
    PauseAndResumeButton.selected = YES;
    NSLog(@"recording paused");
    WeAreRecording = NO;

    [MovieFileOutput stopRecording];
    [self pauseTimer:recordingTimer];

    [btnStartAndStop setEnabled:NO];
    [btnBack setEnabled:YES];
    [btnSwitchCameraInput setHidden:NO];
}
else
{
    PauseAndResumeButton.selected = NO;
    NSLog(@"recording resumed");

    [btnStartAndStop setEnabled:YES];
    [btnBack setEnabled:NO];
    [btnSwitchCameraInput setHidden:YES];

    WeAreRecording = YES;

    NSDate *date = [NSDate date];
    NSLog(@" date %@",date);

    NSArray *paths                  =   NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask, YES);
    NSString *recordedFileName = nil;
    recordedFileName = [NSString stringWithFormat:@"output%@.mov",date];
    NSString *documentsDirectory    =   [paths objectAtIndex:0];
    self.outputPath                 =   [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@",recordedFileName]];
    NSLog(@"%@",self.outputPath);

    [arrVideoName addObject:recordedFileName];

    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:self.outputPath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:self.outputPath])
    {
        NSError *error;
        if ([[NSFileManager defaultManager] removeItemAtPath:self.outputPath error:&error] == NO)
        {
            //Error - handle if required
        }
    }
    [self resumeTimer:recordingTimer];
    //Start recording
    [MovieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
}
}

- (void) CameraSetOutputProperties
{
//SET THE CONNECTION PROPERTIES (output properties)
AVCaptureConnection *CaptureConnection = [MovieFileOutput connectionWithMediaType:AVMediaTypeVideo];

[CaptureConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
//Set frame rate (if required)
CMTimeShow(CaptureConnection.videoMinFrameDuration);
CMTimeShow(CaptureConnection.videoMaxFrameDuration);

if (CaptureConnection.supportsVideoMinFrameDuration)
    CaptureConnection.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
if (CaptureConnection.supportsVideoMaxFrameDuration)
    CaptureConnection.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);

CMTimeShow(CaptureConnection.videoMinFrameDuration);
CMTimeShow(CaptureConnection.videoMaxFrameDuration);
}
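
CameraSetOutputProperties relies on a CAPTURE_FRAMES_PER_SECOND macro that is not defined anywhere in the posted code. It is presumably a compile-time constant along these lines (the value 30 is an assumption, not taken from the original post):

#define CAPTURE_FRAMES_PER_SECOND 30    //assumed value; the original post does not show this definition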

- (AVCaptureDevice *) CameraWithPosition:(AVCaptureDevicePosition) Position
{
NSArray *Devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *Device in Devices)
{
    if ([Device position] == Position)
    {
         NSLog(@"%d",Position);
        return Device;
    }
}
return nil;
}
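
viewDidLoad above also calls a backFacingCamera helper that never appears in the question. A minimal sketch of what it presumably looks like, assuming it simply wraps CameraWithPosition: (the implementation is an assumption):

- (AVCaptureDevice *) backFacingCamera
{
    //Assumed implementation: reuse CameraWithPosition: to pick the rear camera
    return [self CameraWithPosition:AVCaptureDevicePositionBack];
}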

#pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate / AVCaptureFileOutputRecordingDelegate Methods

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{

if (videoWriterInput.readyForMoreMediaData && WeAreRecording)
{
    [videoWriterInput appendSampleBuffer:sampleBuffer];
}

for(AVCaptureConnection *captureConnection in [captureOutput connections])
{
    if ([captureConnection isVideoOrientationSupported])
    {
        AVCaptureVideoOrientation orientation = AVCaptureVideoOrientationLandscapeLeft;
        [captureConnection setVideoOrientation:orientation];
    }
}     

 UIDeviceOrientation curOr = [[UIDevice currentDevice] orientation];

 CGAffineTransform t;

if (curOr == UIDeviceOrientationPortrait) 
{
     t = CGAffineTransformMakeRotation(-M_PI / 2);
} 
else if (curOr == UIDeviceOrientationPortraitUpsideDown)
{
     t = CGAffineTransformMakeRotation(M_PI / 2);
} 
else if (curOr == UIDeviceOrientationLandscapeRight) 
{
     t = CGAffineTransformMakeRotation(M_PI);
}
else
{
     t = CGAffineTransformMakeRotation(0);
}
}
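
Note that the transform t computed in the callback above is never applied to anything. If the intent was to orient the frames written through videoWriterInput, the usual pattern is to set the transform once on the writer input before the asset writer starts, not per frame from the capture callback. A hedged sketch (the writerSettings values and the setup location are assumptions):

//Sketch only: configure the writer input once, before the AVAssetWriter
//starts writing; AVAssetWriterInput.transform cannot change per frame
NSDictionary *writerSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                AVVideoCodecH264, AVVideoCodecKey,
                                [NSNumber numberWithInt:480], AVVideoWidthKey,
                                [NSNumber numberWithInt:320], AVVideoHeightKey,
                                nil];
AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:writerSettings];
videoWriterInput.expectsMediaDataInRealTime = YES;
videoWriterInput.transform = CGAffineTransformMakeRotation(-M_PI / 2);  //e.g. for portrait capture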

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
NSLog(@"didFinishRecordingToOutputFileAtURL - enter");
NSLog(@"output file url : %@", [outputFileURL absoluteString]);

BOOL RecordedSuccessfully = YES;
if ([error code] != noErr)
{
    // A problem occurred: Find out if the recording was successful.
    id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
    if (value)
    {
        RecordedSuccessfully = [value boolValue];
    }
}
AVCaptureConnection *videoConnection=nil;
for ( AVCaptureConnection *connection in [MovieFileOutput connections] )
{
    NSLog(@"%@", connection);
    for ( AVCaptureInputPort *port in [connection inputPorts] )
    {
        NSLog(@"%@", port);
        if ( [[port mediaType] isEqual:AVMediaTypeVideo] )
        {
            videoConnection = connection;
        }
    }
}

if ([videoConnection isVideoOrientationSupported]) // **Here it is, it's always false**
{
    [videoConnection setVideoOrientation:(AVCaptureVideoOrientation)[[UIDevice currentDevice] orientation]];
}

NSLog(@"Setting image quality");

NSData *videoData = [NSData dataWithContentsOfURL:outputFileURL];
[videoData writeToFile:self.outputPath atomically:NO];

[arrOutputUrl addObject:outputFileURL];

if (stopRecording)
{
    [self mergeMultipleVideo];
}
}

//Method to merge the recorded video segments into a single composition
-(void)mergeMultipleVideo
{
mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime nextClipStartTime = kCMTimeZero;
NSLog(@"Array of output file url : %@", arrOutputUrl);
if (arrOutputUrl.count > 0)
{
    for(int i = 0 ;i < [arrOutputUrl count];i++)
    {
        AVURLAsset* VideoAsset = [[AVURLAsset alloc]initWithURL:[arrOutputUrl objectAtIndex:i] options:nil];

        CMTimeRange timeRangeInAsset;
        timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [VideoAsset duration]);

        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, VideoAsset.duration) ofTrack:[[VideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
        nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
    }
}

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov",self.finalRecordedVideoName]];
NSURL *url = [NSURL fileURLWithPath:myPathDocs];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL=url;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.shouldOptimizeForNetworkUse = YES;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        [self exportDidFinish:exportSession path:myPathDocs];
    });
}];
}
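
One more thing worth flagging: mergeMultipleVideo only inserts the video track of each segment, so any recorded audio is dropped from the merged file. A hedged sketch of the extra steps that would carry the audio across, assuming each segment file actually contains an audio track:

//Add alongside compositionVideoTrack, before the segment loop:
AVMutableCompositionTrack *compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

//Then, inside the loop, next to the video insertTimeRange: call:
NSArray *audioTracks = [VideoAsset tracksWithMediaType:AVMediaTypeAudio];
if ([audioTracks count] > 0)
{
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, VideoAsset.duration)
                                   ofTrack:[audioTracks objectAtIndex:0]
                                    atTime:nextClipStartTime
                                     error:nil];
}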

-(void)exportDidFinish:(AVAssetExportSession*)session path:(NSString*)outputVideoPath
{
NSLog(@"session.status : %d",session.status);
if (session.status == AVAssetExportSessionStatusCompleted)
{
    NSURL *outputURL = session.outputURL;

    NSData *videoData = [NSData dataWithContentsOfURL:outputURL];
    [videoData writeToFile:outputVideoPath atomically:NO];

    if ([arrVideoName count] > 0)
    {
        for (int i = 0; i < [arrVideoName count]; i++)
        {
            NSArray* documentPaths  = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
            NSString* fullFilePath  = [[documentPaths objectAtIndex:0] stringByAppendingPathComponent: [NSString stringWithFormat:@"%@",[arrVideoName objectAtIndex:i]]];

            NSLog(@"Full path of file to be deleted: %@",fullFilePath);

            NSFileManager *fileManager  =   [NSFileManager defaultManager];
            NSError *error;

            if ([fileManager fileExistsAtPath:fullFilePath])
            {
                [fileManager removeItemAtPath:fullFilePath error:&error];
            }
        }
        [arrVideoName removeAllObjects];
    }
    if (arrOutputUrl.count > 0)
    {
        [arrOutputUrl removeAllObjects];
    }
    [((ActOutAppDelegate *)ActOut_AppDelegate) removeLoadingViewfromView:self.view];
    [self.view addSubview:afterRecordingPopupView];
}
}


Recommended answer

Look at the AVCaptureConnection's enabled property. For your output connection, set enabled to NO instead of stopping the session.
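
A minimal sketch of that suggestion, using the MovieFileOutput from the question: keep the single startRecordingToOutputFileURL: call alive for the whole take and toggle the connections on pause/resume, so every segment lands in one file.

//Pause: stop feeding frames to the file output without ending the
//recording or tearing down the session
for (AVCaptureConnection *connection in [MovieFileOutput connections])
{
    connection.enabled = NO;
}

//Resume: frames flow into the same movie file again
for (AVCaptureConnection *connection in [MovieFileOutput connections])
{
    connection.enabled = YES;
}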
