iOS Combine three videos - rotate the center video


Question

I have three videos. The first is from the rear camera. The second is from the front camera and the third is again from the rear camera. The videos are always taken in landscape mode with the home button on the right.

The rear facing videos are in correct orientation. The center video, taken using the front camera, is rotated at 180degrees (upside down). I have been researching and trying numerous methods to transform the center video with no luck. I get the same results every time.

I am getting pretty frustrated with this whole process. Everything I read online and the comments/suggestions from the reviewer here should work but it does not work. The video is the same no matter what I try for transformations. It continually acts as if I did not apply any transformations. Nothing. I do not understand why the transformations are ignored on this. I have spent weeks on this and I am at the end - it simply does not work.

Here is the current iteration of my code:

- (void)mergeVideos2:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion {

    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];
    __block NSMutableArray *instructions = [[NSMutableArray alloc] init];
    __block CGSize size = CGSizeZero;
    __block CMTime time = kCMTimeZero;

    __block AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];

    __block CGAffineTransform transformflip = CGAffineTransformMakeScale(1, -1);
    //    __block CGAffineTransform transformflip = CGAffineTransformMakeRotation(M_PI);

    __block int32_t commontimescale = 600;

    [assets enumerateObjectsUsingBlock:^(id  _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {

        NSURL *assetUrl = (NSURL *)obj;
        AVAsset *asset = [AVAsset assetWithURL:assetUrl];

        CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);

        NSLog(@"%s: Number of tracks: %lu", __PRETTY_FUNCTION__, (unsigned long)[[asset tracks] count]);
        AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;

        NSError *error;
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"%s: Error - %@", __PRETTY_FUNCTION__, error.debugDescription);
        }

        AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

        CGAffineTransform transform = assetTrack.preferredTransform;
        [videoLayerInstruction setTransform:CGAffineTransformConcat(transform, transformflip) atTime:time];

        // the main instruction set - this is wrapping the time
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
        if (videoLayerInstruction != nil)
            videoCompositionInstruction.layerInstructions = @[videoLayerInstruction];
        [instructions addObject:videoCompositionInstruction];

        // time increment variables
        time = CMTimeAdd(time, cliptime);

        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = assetTrack.naturalSize;
        }

    }];

    mutableVideoComposition.instructions = instructions;

    // set the frame rate to 12 fps
    mutableVideoComposition.frameDuration = CMTimeMake(1, 12);

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths firstObject];
    int number = arc4random_uniform(10000);
    self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov",number];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                      presetName:AVAssetExportPreset1280x720];

    exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];
    //Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;

    dispatch_group_t group = dispatch_group_create();

    dispatch_group_enter(group);

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_group_leave(group);

    }];

    dispatch_group_notify(group, dispatch_get_main_queue(), ^{

        // get the size of the file
        unsigned long long size = [[[NSFileManager defaultManager] attributesOfItemAtPath:self.outputFile error:nil] fileSize];
        NSString *filesize = [NSByteCountFormatter stringFromByteCount:size countStyle:NSByteCountFormatterCountStyleFile];
        NSString *thereturn = [NSString stringWithFormat:@"%@: %@", self.outputFile, filesize];

        NSLog(@"Export File (Final) - %@", self.outputFile);
        completion(thereturn);

    });

}

Any thoughts or suggestions?

Answer

Each AVAssetTrack has a preferredTransform property. It contains information on how to rotate and translate the video to display it properly so you don't have to guess. Use each video's preferredTransform in each layer instruction.

Don't set "videoCompositionTrack.preferredTransform = ..."

Remove the transform ramp "[videoLayerInstruction setTransformRampFromStartTransform:..."

In the enumeration, just use:

CGAffineTransform transform = assetTrack.preferredTransform;
[videoLayerInstruction setTransform:transform atTime:time];

I'm assuming your videos are shot with the same dimensions as your output, with the middle video having its width and height reversed. If they are not, you'll have to add the appropriate scaling:

float scaleFactor = ...; // e.g. (outputWidth / videoWidth)
CGAffineTransform scale = CGAffineTransformMakeScale(scaleFactor, scaleFactor);
transform = CGAffineTransformConcat(transform, scale);
[videoLayerInstruction setTransform:transform atTime:time];

EDIT: It appears that the source videos that appeared upside down in the composition were upside down to begin with, but had an identity CGAffineTransform. This code worked to show them in the correct orientation:

- (void)mergeVideos2:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion {

        AVMutableComposition *mutableComposition = [AVMutableComposition composition];
        AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                           preferredTrackID:kCMPersistentTrackID_Invalid];
        __block NSMutableArray *instructions = [[NSMutableArray alloc] init];
        __block CMTime time = kCMTimeZero;
        __block AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
        __block int32_t commontimescale = 600;

        // Create one layer instruction.  We have one video track, and there should be one layer instruction per video track.
        AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

        [assets enumerateObjectsUsingBlock:^(id  _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {

            NSURL *assetUrl = (NSURL *)obj;
            AVAsset *asset = [AVAsset assetWithURL:assetUrl];

            CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);

            NSLog(@"%s: Number of tracks: %lu", __PRETTY_FUNCTION__, (unsigned long)[[asset tracks] count]);
            AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
            CGSize naturalSize = assetTrack.naturalSize;

            NSError *error;
            //insert the video from the assetTrack into the composition track
            [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                                           ofTrack:assetTrack
                                            atTime:time
                                             error:&error];
            if (error) {
                NSLog(@"%s: Error - %@", __PRETTY_FUNCTION__, error.debugDescription);
            }


            CGAffineTransform transform = assetTrack.preferredTransform;

            //set the layer to have this videos transform at the time that this video starts
            if (<* the video is an intermediate video  - has the wrong orientation*>) {
                //these videos have the identity transform, yet they are upside down.
                //we need to rotate them by M_PI radians (180 degrees) and shift the video back into place

                CGAffineTransform rotateTransform = CGAffineTransformMakeRotation(M_PI);
                CGAffineTransform translateTransform = CGAffineTransformMakeTranslation(naturalSize.width, naturalSize.height);
                [videoLayerInstruction setTransform:CGAffineTransformConcat(rotateTransform, translateTransform) atTime:time];

            } else {
                [videoLayerInstruction setTransform:transform atTime:time];
            }

            // time increment variables
            time = CMTimeAdd(time, cliptime);

        }];

        // the main instruction set - this is wrapping the time
        AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero,mutableComposition.duration); //make the instruction last for the entire composition
        videoCompositionInstruction.layerInstructions = @[videoLayerInstruction];
        [instructions addObject:videoCompositionInstruction];
        mutableVideoComposition.instructions = instructions;

        // set the frame rate to 12 fps
        mutableVideoComposition.frameDuration = CMTimeMake(1, 12);

        //set the rendersize for the video we're about to write
        mutableVideoComposition.renderSize = CGSizeMake(1280,720);

        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths firstObject];
        int number = arc4random_uniform(10000);
        self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov",number];

        //let the rendersize of the video composition dictate size.  use quality preset here
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                          presetName:AVAssetExportPresetHighestQuality];

        exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];
        //Set the output file type
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        exporter.shouldOptimizeForNetworkUse = YES;
        exporter.videoComposition = mutableVideoComposition;

        dispatch_group_t group = dispatch_group_create();

        dispatch_group_enter(group);

        [exporter exportAsynchronouslyWithCompletionHandler:^{
            dispatch_group_leave(group);

        }];

        dispatch_group_notify(group, dispatch_get_main_queue(), ^{

            // get the size of the file
            unsigned long long size = [[[NSFileManager defaultManager] attributesOfItemAtPath:self.outputFile error:nil] fileSize];
            NSString *filesize = [NSByteCountFormatter stringFromByteCount:size countStyle:NSByteCountFormatterCountStyleFile];
            NSString *thereturn = [NSString stringWithFormat:@"%@: %@", self.outputFile, filesize];

            NSLog(@"Export File (Final) - %@", self.outputFile);
            completion(thereturn);

        });
    }
