Video recording in iPhone SDK programmatically


Problem description


I am trying to implement the following functionality:

Final recorded video = "a video captured from the front camera + the audio of a video file that I am playing at the same time (through a video player)".

Please see the attached screenshot for a clearer picture.

With the blocks of code given below, what I end up with is a video but without any audio.

What I actually want is a final recording that combines the video captured from my front camera with only the audio of the video file that I am playing.

Can anyone help or guide me on how to achieve this? Any help will be appreciated.

This is my code.

"Recording" Button Click Method is as following :

-(void) startRecording
{
    [self initCaptureSession];

    NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
                                         pathForResource:@"video"
                                         ofType:@"mp4"]];
    [self playMovieAtURL:url];

    [self startVideoRecording];
}

"initCaptureSession" : Using this method i am recording a video using front camera using "AVCaptureSession"

-(void) initCaptureSession
{
    NSLog(@"Setting up capture session");
    captureSession = [[AVCaptureSession alloc] init];

    NSLog(@"Adding video input");

    AVCaptureDevice *VideoDevice =  [self frontFacingCameraIfAvailable ];

    if (VideoDevice)
    {
        NSError *error;
        videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:VideoDevice error:&error];
        if (!error)
        {
            if ([captureSession canAddInput:videoInputDevice])
            {
                [captureSession addInput:videoInputDevice];
            }
            else
            {
                NSLog(@"Couldn't add video input");
            }
        }
        else
        {
            NSLog(@"Couldn't create video input");
        }
    }
    else
    {
        NSLog(@"Couldn't create video capture device");
    }


    NSLog(@"Adding audio input");
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed];
    NSError *error = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error];
    if (audioInput)
    {
        [captureSession addInput:audioInput];
    }


    NSLog(@"Adding movie file output");
    movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024;    //<<SET MIN FREE SPACE IN BYTES FOR RECORDING TO CONTINUE ON A VOLUME

    if ([captureSession canAddOutput:movieFileOutput])
        [captureSession addOutput:movieFileOutput];

    [self CameraSetOutputProperties];           //(We call a method as it also has to be done after changing camera)

    NSLog(@"Setting image quality");
    [captureSession setSessionPreset:AVCaptureSessionPresetMedium];
    if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480])     //Check size based configs are supported before setting them
        [captureSession setSessionPreset:AVCaptureSessionPreset640x480];

    [captureSession startRunning];
}
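
For reference, `frontFacingCameraIfAvailable` is not included in the post. A typical implementation from that iOS era simply enumerates the video capture devices and picks the front-facing one; the sketch below is a hypothetical helper, not the asker's actual code:

- (AVCaptureDevice *) frontFacingCameraIfAvailable
{
    // Enumerate the video capture devices and return the front camera if present.
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo])
    {
        if ([device position] == AVCaptureDevicePositionFront)
        {
            return device;
        }
    }
    // Fall back to the default (usually rear) camera if no front camera exists.
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}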

- (void) CameraSetOutputProperties
{
    AVCaptureConnection *CaptureConnection=nil;

    NSComparisonResult order = [[UIDevice currentDevice].systemVersion compare: @"5.0.0" options: NSNumericSearch];
    if (order == NSOrderedSame || order == NSOrderedDescending) {
        // OS version >= 5.0.0
        CaptureConnection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    } else {
        // OS version < 5.0.0
        CaptureConnection = [self connectionWithMediaType:AVMediaTypeVideo fromConnections:[movieFileOutput connections]];

    }

    //Set landscape (if required)
    if ([CaptureConnection isVideoOrientationSupported])
    {
        AVCaptureVideoOrientation orientation =  AVCaptureVideoOrientationPortrait;// AVCaptureVideoOrientationLandscapeRight;      //<<<<<SET VIDEO ORIENTATION IF LANDSCAPE
        [CaptureConnection setVideoOrientation:orientation];
    }

   }
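
The iOS < 5.0 branch above calls `connectionWithMediaType:fromConnections:`, which is also not shown in the post. A common implementation (based on Apple's AVCam sample code) walks each connection's input ports and returns the first connection carrying the requested media type; reproduced here as a sketch:

- (AVCaptureConnection *) connectionWithMediaType:(NSString *)mediaType fromConnections:(NSArray *)connections
{
    // Return the first connection whose input ports include the requested media type.
    for (AVCaptureConnection *connection in connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:mediaType])
            {
                return connection;
            }
        }
    }
    return nil;
}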

"-(void) playMovieAtURL: (NSURL*) theURL " By using this method i am playing a video

-(void) playMovieAtURL: (NSURL*) theURL
{

player =
[[MPMoviePlayerController alloc] initWithContentURL: theURL ];
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];

player.scalingMode = MPMovieScalingModeAspectFill;
player.controlStyle = MPMovieControlStyleNone;
[player prepareToPlay];

[[NSNotificationCenter defaultCenter]
 addObserver: self
 selector: @selector(myMovieFinishedCallback:)
 name: MPMoviePlayerPlaybackDidFinishNotification
 object: player];
player.view.frame=CGRectMake(10, 30, 300, 200);
[self.view addSubview:player.view];

[player play];
}

"startVideoRecording" using this method i have started recording the Final video.

- (void) startVideoRecording
{
    //Create temporary URL to record to
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath])
    {
        NSError *error;
        if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
        {
            //Error - handle if required
            NSLog(@"file remove error");
        }
    }
    //Start recording
    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];

}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{

    NSLog(@"didFinishRecordingToOutputFileAtURL - enter");

    BOOL RecordedSuccessfully = YES;
    if ([error code] != noErr)
    {
        // A problem occurred: Find out if the recording was successful.
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value)
        {
            RecordedSuccessfully = [value boolValue];
        }
    }
    if (RecordedSuccessfully)
    {
        //----- RECORDED SUCCESSFULLY -----
        NSLog(@"didFinishRecordingToOutputFileAtURL - success");
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
                                        completionBlock:^(NSURL *assetURL, NSError *error)
             {
                 if (error)
                 {
                     NSLog(@"File save error");
                 }
                 else
                 {
                     recordedVideoURL=assetURL;
                 }
             }];
        }
        else
        {

            NSString *assetURL=[self copyFileToDocuments:outputFileURL];
            if(assetURL!=nil)
            {
                recordedVideoURL=[NSURL URLWithString:assetURL];
            }
        }
    }
}
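
`copyFileToDocuments:` is referenced above but never defined in the post. A hypothetical implementation that would satisfy the caller (it returns a file-URL string so that `[NSURL URLWithString:assetURL]` yields a valid URL) might look like this:

- (NSString *) copyFileToDocuments:(NSURL *)fileURL
{
    // Copy the temporary recording into the app's Documents directory.
    NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSString *destinationPath = [documentsDirectory stringByAppendingPathComponent:[[fileURL path] lastPathComponent]];

    NSError *error = nil;
    [[NSFileManager defaultManager] removeItemAtPath:destinationPath error:nil];
    if (![[NSFileManager defaultManager] copyItemAtPath:[fileURL path] toPath:destinationPath error:&error])
    {
        NSLog(@"File copy error: %@", [error localizedDescription]);
        return nil;
    }
    return [[NSURL fileURLWithPath:destinationPath] absoluteString];
}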

Solution

// Add some extra code to the following methods. 1st method:

-(void) playMovieAtURL: (NSURL*) theURL
{
    [player play];

    // Put the audio session into a category that allows simultaneous playback and recording.
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *err = nil;
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
    if (err)
    {
        NSLog(@"audioSession: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]);
        return;
    }
    [audioSession setActive:YES error:&err];
    if (err)
    {
        NSLog(@"audioSession: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]);
        return;
    }

    // Recorder settings for a small mono IMA4 file.
    recordSetting = [[NSMutableDictionary alloc] init];
    [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
    [recordSetting setValue:[NSNumber numberWithFloat:16000.0] forKey:AVSampleRateKey];
    [recordSetting setValue:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];

    // DOCUMENTS_FOLDER is assumed to be a macro expanding to the app's Documents directory path.
    recorderFilePath = [NSString stringWithFormat:@"%@/MySound.caf", DOCUMENTS_FOLDER];
    NSLog(@"recorderFilePath: %@", recorderFilePath);
    audio_url = [NSURL fileURLWithPath:recorderFilePath];

    // Remove any previous recording at that path.
    err = nil;
    NSData *audioData = [NSData dataWithContentsOfFile:[audio_url path] options:0 error:&err];
    if (audioData)
    {
        NSFileManager *fm = [NSFileManager defaultManager];
        [fm removeItemAtPath:[audio_url path] error:&err];
    }

    err = nil;
    recorder = [[AVAudioRecorder alloc] initWithURL:audio_url settings:recordSetting error:&err];
    if (!recorder)
    {
        NSLog(@"recorder: %@ %ld %@", [err domain], (long)[err code], [[err userInfo] description]);
        UIAlertView *alert =
        [[UIAlertView alloc] initWithTitle:@"Warning"
                                   message:[err localizedDescription]
                                  delegate:nil
                         cancelButtonTitle:@"OK"
                         otherButtonTitles:nil];
        [alert show];
        return;
    }

    // Prepare to record.
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    recorder.meteringEnabled = YES;

    BOOL audioHWAvailable = audioSession.inputAvailable;
    if (!audioHWAvailable)
    {
        UIAlertView *cantRecordAlert =
        [[UIAlertView alloc] initWithTitle:@"Warning"
                                   message:@"Audio input hardware not available"
                                  delegate:nil
                         cancelButtonTitle:@"OK"
                         otherButtonTitles:nil];
        [cantRecordAlert show];
        return;
    }

    // Start the audio recorder so that MySound.caf is actually written while the movie plays.
    [recorder record];
}

// 2nd method

-(void) stopVideoRecording
{
    [player.view removeFromSuperview];
    [player stop];
    [recorder stop];                    // finalize the audio file (MySound.caf)
    [movieFileOutput stopRecording];    // note: this finishes asynchronously; strictly, the merge below
                                        // should wait for the didFinishRecordingToOutputFileAtURL: callback

    // outputURL is assumed to be an ivar holding the same file URL that was passed to
    // startRecordingToOutputFileURL: in startVideoRecording.
    AVURLAsset* audioAsset = [[AVURLAsset alloc] initWithURL:audio_url options:nil];
    AVURLAsset* videoAsset = [[AVURLAsset alloc] initWithURL:outputURL options:nil];

    mixComposition = [AVMutableComposition composition];

    // Audio track: the separately recorded soundtrack.
    AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
                                        ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                         atTime:kCMTimeZero error:nil];

    // Video track: the footage captured from the front camera.
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero error:nil];

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetPassthrough];
    // The export session is only created here; configuring and starting it is shown in the sketch after this method.

    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [compositionVideoTrack setPreferredTransform:videoTrack.preferredTransform];
}
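
The export session created in stopVideoRecording is never configured or started in the answer as posted. If the merged composition needs to be written to a file (rather than only played back through an AVPlayerItem, as in the snippet below), something along these lines would have to run at the end of stopVideoRecording; the output file name here is just an example:

    // Sketch: write the merged composition to a new movie file.
    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"mixed.mov"];   // example name
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];

    _assetExport.outputURL = [NSURL fileURLWithPath:exportPath];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        if (_assetExport.status == AVAssetExportSessionStatusCompleted)
        {
            NSLog(@"Export finished: %@", exportPath);
        }
        else
        {
            NSLog(@"Export failed: %@", [_assetExport.error localizedDescription]);
        }
    }];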

// Finally, play the merged composition:

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
AVPlayer *player1 = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player1];
[playerLayer setFrame:CGRectMake(0, 0, 320, 480)];
[[[self view] layer] addSublayer:playerLayer];
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[player1 play];
player1.actionAtItemEnd = AVPlayerActionAtItemEndNone;
