AVFoundation - combining videos, only the first is displayed


Question

I am trying to take a different approach at combining videos. I am creating a new track for each transformation.

The problem with this code is that the first video is shown and all others are black.

The audio overlay is correct for the entire segment. It looks like the video is not brought in to the composition because the size of the file is 5 M when it should be about 25M. The 5M size correlates to the size of the first clip and the audio track. All of the AVAssets appear to be valid. The files do exist on the file system. Here is the code:

- (void)mergeVideos:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion; {


    //    NSMutableArray *instructions = [NSMutableArray new];
    CGSize size = CGSizeZero;
    CMTime currentstarttime = kCMTimeZero;

    int tracknumber = 1;
    int32_t commontimescale = 600;
    CMTime time = kCMTimeZero;

    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    NSMutableArray *instructions = [[NSMutableArray alloc] init];

    for (NSURL *assetUrl in assets) {

        AVAsset *asset = [AVAsset assetWithURL:assetUrl];

        NSLog(@"Number of tracks: %lu  Incremental track number %i", (unsigned long)[[asset tracks] count], tracknumber);

        // make sure the timescales are correct for these tracks
        CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);

        AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                           preferredTrackID:kCMPersistentTrackID_Invalid];

        AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;

        NSLog(@"Running time: value = %lld  timescale = %d", time.value, time.timescale);
        NSLog(@"Asset length: value = %lld  timescale = %d", asset.duration.value, asset.duration.timescale);
        NSLog(@"Converted Scale: value = %lld  timescale = %d", cliptime.value, cliptime.timescale);

        NSError *error;

        [videoCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, time)];
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(time, cliptime)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"Error - %@", error.debugDescription);
        }

        // this flips the video temporarily for the front facing camera
        AVMutableVideoCompositionLayerInstruction *inst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

        // set the flipping transform on the correct tracks
        if ((tracknumber == 2) || (tracknumber == 4) || (tracknumber == 6) || (tracknumber == 8) || (tracknumber == 10)) {
            CGAffineTransform transform = CGAffineTransformMakeRotation(M_PI);
            [inst setTransform:transform atTime:time];
        } else {
            CGAffineTransform transform = assetTrack.preferredTransform;
            [inst setTransform:transform atTime:time];
        }

        // don't block the other videos with this track's black frames - needs to be the incremental time
        [inst setOpacity:0.0 atTime:time];

        // add the instructions to the overall array
        [instructions addObject:inst];

        // increment the total time after we use it for this iteration
        time = CMTimeAdd(time, cliptime);

        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject.naturalSize;
        }

        // increment the track counter
        tracknumber++;
    }

    AVMutableVideoCompositionInstruction *mainVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);

    mainVideoCompositionInstruction.layerInstructions = instructions;

    // bring all of the video together in the main composition
    AVMutableVideoComposition *mainVideoComposition = [AVMutableVideoComposition videoComposition];
    mainVideoComposition.instructions = [NSArray arrayWithObject:mainVideoCompositionInstruction];

    // setup the audio
    AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];


    // Grab the path, make sure to add it to your project!
    NSURL *soundURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"bink-bink-lexus-3" ofType:@"aif"]];
    AVURLAsset *soundAsset = [AVURLAsset assetWithURL:soundURL];

    NSError *error;

    // add audio to the entire track
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, mutableComposition.duration)
                                   ofTrack:[soundAsset tracksWithMediaType:AVMediaTypeAudio][0]
                                    atTime:kCMTimeZero
                                     error:&error];

    // Set the frame duration to an appropriate value (i.e. 30 frames per second for video).
    //    mainVideoComposition.frameDuration = CMTimeMake(1, 30);
    mainVideoComposition.renderSize = size;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths firstObject];
    int number = arc4random_uniform(10000);
    self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov",number];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                      presetName:AVAssetExportPreset1280x720];

    exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];
    //Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;


    dispatch_group_t group = dispatch_group_create();


    dispatch_group_enter(group);

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_group_leave(group);

    }];

    dispatch_group_notify(group, dispatch_get_main_queue(), ^{

        NSLog(@"Export File (Final) - %@", self.outputFile);
        completion(self.outputFile);

    });

}

Answer

Your problem is that by using multiple AVMutableCompositionTracks and inserting a time range at a time after kCMTimeZero, you are causing each subsequent track to have its media appear in the composition at kCMTimeZero. You need to use insertEmptyTimeRange: if you want to pursue this route. It will move the media for that particular track forward in time by the duration of the empty range you insert.
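If you stay on the multi-track route, the padding must be inserted into each new track before its clip. A minimal sketch of one loop iteration, reusing the question's variable names (illustrative only, untested):

// pad the new track so its media starts at the running offset
// rather than at kCMTimeZero (names follow the question's code)
AVMutableCompositionTrack *videoCompositionTrack =
    [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];

if (CMTIME_COMPARE_INLINE(time, >, kCMTimeZero)) {
    // empty range of duration `time` shifts this track's media forward
    [videoCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, time)];
}

NSError *error = nil;
[videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                               ofTrack:assetTrack
                                atTime:time
                                 error:&error];

Without the empty range, each track's media collapses back to kCMTimeZero, so every clip after the first plays behind the first one and is never seen.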

Or, a much much easier way would be to use a single AVMutableCompositionTrack.
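A sketch of that single-track approach, appending each clip at the running offset (identifiers mirror the question's code; this is an untested outline, not a drop-in implementation):

// one video track, clips appended back-to-back
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];

CMTime time = kCMTimeZero;
for (NSURL *assetUrl in assets) {
    AVAsset *asset = [AVAsset assetWithURL:assetUrl];
    AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;

    NSError *error = nil;
    // inserting at the running offset keeps the clips sequential
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:assetTrack
                         atTime:time
                          error:&error];
    time = CMTimeAdd(time, asset.duration);
}

Per-clip transforms (e.g. the front-camera flip) can still be applied with a single AVMutableVideoCompositionLayerInstruction on this one track, calling setTransform:atTime: at each clip's start time.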
