Black Video CAAnimation and AVFoundation AVAssetExportSession


Problem Description

I'm a relative newbie on the whole AVFoundation video editing circuit.

My current test app is a two-screen application: the first screen does an AVFoundation video recording (1.mov), and the second screen lets you view the video and put some title credits on it with a CAAnimation.

The 1.mov video file is recorded in portrait, saved to disk, and then run through this routine, which should give me a title on top of the video. However, all that I get is a black video of the right dimensions and time length, with the CATextLayer on it.

I'm pretty sure I'm missing something basic. I do have code in place that should handle the whole landscape/portrait rotation.

-(IBAction)ComposeMovie:(id)sender {
    NSLog (@"ComposeMovie");

    CALayer *aLayer = [CALayer layer];
    aLayer.frame = CGRectMake(0, 0, videoSize.height, videoSize.width); 
    CALayer *bLayer = [CALayer layer]; 

    NSLog(@"Create the title"); 
    CATextLayer *titleLayer = [CATextLayer layer]; 
    titleLayer.string = @"SUDO make me a sandwich"; 
    titleLayer.font = [UIFont boldSystemFontOfSize:18].fontName; 
    titleLayer.backgroundColor = [UIColor whiteColor].CGColor; 
    titleLayer.foregroundColor = [UIColor blackColor].CGColor; 
    titleLayer.fontSize = 24; 
    titleLayer.alignmentMode = kCAAlignmentRight; 
    titleLayer.bounds = CGRectMake(videoSize.width, videoSize.height /6, 300, 32); 
    [aLayer addSublayer:titleLayer]; 

    NSURL *url = [NSURL fileURLWithPath:getCaptureMoviePath()]; //Hard coded path to the 1.mov file in the documents directory
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];

    AVMutableComposition *cmp = [[AVMutableComposition alloc] init] ;  
    AVMutableCompositionTrack *trackA = [cmp addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *error = nil ;
    AVAssetTrack *sourceVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [trackA insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:sourceVideoTrack atTime:kCMTimeZero error:&error] ;
    AVMutableVideoComposition *animComp = [[AVMutableVideoComposition videoComposition] retain];
    animComp.renderSize = CGSizeMake(videoSize.height, videoSize.width); 
    animComp.frameDuration = CMTimeMake(1,30);

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30) ); 

    AVMutableVideoCompositionLayerInstruction* rotator = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]];
    CGAffineTransform translateToCenter = CGAffineTransformMakeTranslation( 0,-320);    
    CGAffineTransform rotateBy90Degrees = CGAffineTransformMakeRotation( M_PI_2);
    CGAffineTransform shrinkWidth = CGAffineTransformMakeScale(0.66, 1); // needed because Apple does a "stretch" by default - really, we should find and undo apple's stretch - I suspect it'll be a CALayer defaultTransform, or UIView property causing this
    CGAffineTransform finalTransform = CGAffineTransformConcat( shrinkWidth, CGAffineTransformConcat(translateToCenter, rotateBy90Degrees) );
    [rotator setTransform:finalTransform atTime:kCMTimeZero];

    instruction.layerInstructions = [NSArray arrayWithObject: rotator];
    animComp.instructions = [NSArray arrayWithObject: instruction];


    NSLog(@"Creating Animation"); 
    //animComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithAdditionalLayer: asTrackID:1];
    animComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithAdditionalLayer:aLayer asTrackID:2];
    animComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithAdditionalLayer:bLayer asTrackID:3]; 
    //AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [asset duration]);
    AVMutableVideoCompositionLayerInstruction* layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:trackA];
    //[layerInstruction setTrackID:1]; 

    /*CMTime startTime = CMTimeMake(3,1); 
    CMTime stopTime = CMTimeMake(5,1); 
    CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime); 
    */ 

    //AVMutableVideoCompositionLayerInstruction *passThroughLayer = AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    CGAffineTransform rotationTransform = CGAffineTransformMakeRotation(degreesToRadians(90.0));
    CGAffineTransform rotateTranslate = CGAffineTransformTranslate(rotationTransform,320,0);
    [layerInstruction setTransform:rotateTranslate atTime:kCMTimeZero];

    [layerInstruction setOpacity:1.0 atTime:kCMTimeZero ];
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    animComp.instructions = [NSArray arrayWithObject:instruction];

    CALayer *parentLayer = [CALayer layer]; 
    CALayer *videoLayer = [CALayer layer]; 
    parentLayer.frame = CGRectMake(0,0, videoSize.width, videoSize.height); 
    videoLayer.frame = CGRectMake(0,0, videoSize.width, videoSize.height); 
    [parentLayer addSublayer:aLayer]; 
    [parentLayer addSublayer:bLayer];
    [parentLayer addSublayer:videoLayer]; 

    animComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

    NSLog(@"Creating File"); 
        NSArray *docPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *tempPath = [docPaths objectAtIndex:0];
        NSLog(@"Temp Path: %@",tempPath);

        NSString *fileName = [NSString stringWithFormat:@"%@/render.MOV",tempPath];
        NSFileManager *fileManager = [NSFileManager defaultManager] ;
        if([fileManager fileExistsAtPath:fileName ]){
            NSError *ferror = nil ;
            BOOL success = [fileManager removeItemAtPath:fileName error:&ferror];
        }

        NSURL *exportURL = [NSURL fileURLWithPath:fileName];

        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:cmp presetName:AVAssetExportPresetHighestQuality]  ;
        exporter.outputURL = exportURL;
        exporter.videoComposition = animComp ;
        exporter.outputFileType= AVFileTypeQuickTimeMovie ;
        [exporter exportAsynchronouslyWithCompletionHandler:^(void){
            switch (exporter.status) {
                case AVAssetExportSessionStatusFailed:{
                    NSLog(@"Fail");
                    break;
                }
                case AVAssetExportSessionStatusCompleted:{
                    NSLog(@"Success");
                    break;
                }

                default:
                    break;
            }
        }];



    NSLog(@"End ComposeMovie"); 


}

Recommended Answer

But if the size of the video isn't 320x480, it will mess up. Use this to get the actual video size:

        CGSize videoSize = [[[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize];
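Building on that answer, here is a minimal sketch of deriving both the render size and the orientation from the source track itself instead of hard-coding 320x480 and a fixed 90-degree rotation. It assumes `videoAsset` is an already-loaded AVAsset; the variable names are illustrative, not taken from the question's code.

```objc
// Sketch: read the track's natural size and preferred transform instead of
// hard-coding dimensions. Assumes `videoAsset` is an AVAsset already loaded.
AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize naturalSize = videoTrack.naturalSize;
CGAffineTransform preferredTransform = videoTrack.preferredTransform;

// Portrait footage is stored rotated: its preferredTransform swaps the axes,
// so the rendered width/height must be swapped to match.
BOOL isPortrait = (preferredTransform.b == 1.0 && preferredTransform.c == -1.0) ||
                  (preferredTransform.b == -1.0 && preferredTransform.c == 1.0);
CGSize renderSize = isPortrait
    ? CGSizeMake(naturalSize.height, naturalSize.width)
    : naturalSize;

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = renderSize;
videoComposition.frameDuration = CMTimeMake(1, 30);

// Apply the track's own transform in the layer instruction so the exported
// frames come out oriented the same way a player would display them.
AVMutableVideoCompositionLayerInstruction *layerInstruction =
    [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[layerInstruction setTransform:preferredTransform atTime:kCMTimeZero];
```

With the render size and transform derived this way, the same composition code works for any capture resolution and orientation, rather than only for a 320x480 portrait recording.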
