iOS - AVAssetExportSession can only export a maximum of 8 tracks after playing with AVPlayer

Problem description

I'm trying to loop some fragments of a recorded video and merge them into one video. I've successfully merged and exported a composition with up to 16 tracks. But when I try to play the composition using AVPlayer before merging, I can only export a maximum of 8 tracks.

First, I create the AVComposition and AVVideoComposition:

    +(void)previewUserClipDanceWithAudio:(NSURL*)videoURL audioURL:(NSURL*)audioFile loop:(NSArray*)loopTime slowMotion:(NSArray*)slowFactor showInViewController:(UIViewController*)viewController completion:(void(^)(BOOL success, AVVideoComposition* videoComposition, AVComposition* composition))completion{

AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];
NSMutableArray *arrayInstruction = [[NSMutableArray alloc] init];
AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

AVURLAsset  *audioAsset = [[AVURLAsset alloc]initWithURL:audioFile options:nil];
//NSLog(@"audio File %@",audioFile);

CMTime duration = kCMTimeZero;

AVAsset *currentAsset = [AVAsset assetWithURL:videoURL];
BOOL  isCurrentAssetPortrait  = YES;

for(NSInteger i=0;i< [loopTime count]; i++) {

    //handle looptime array
    NSInteger loopDur = [[loopTime objectAtIndex:i] intValue];
    NSInteger value = labs(loopDur);
    //NSLog(@"loopInfo %d value %d",loopInfo,value);
    //handle slowmotion array
    double slowInfo = [[slowFactor objectAtIndex:i] doubleValue];
    double videoScaleFactor = fabs(slowInfo);

    AVMutableCompositionTrack *currentTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack;
    audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                             preferredTrackID:kCMPersistentTrackID_Invalid];
    if (i==0) {
        [currentTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];

        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, currentAsset.duration) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];

    } else {

        [currentTrack insertTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10)) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:duration error:nil];

        if (videoScaleFactor==1) {

            [audioTrack insertTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10)) ofTrack:[[currentAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:duration error:nil];
        }
        //slow motion here
        if (videoScaleFactor!=1) {

            [currentTrack scaleTimeRange:CMTimeRangeMake(CMTimeSubtract(currentAsset.duration, CMTimeMake(value, 10)), CMTimeMake(value, 10))
                              toDuration:CMTimeMake(value*videoScaleFactor, 10)];
            NSLog(@"slowmo %f",value*videoScaleFactor);
        }
    }

    AVMutableVideoCompositionLayerInstruction *currentAssetLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:currentTrack];
    AVAssetTrack *currentAssetTrack = [[currentAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    BOOL  isCurrentAssetPortrait  = YES;
    //CGFloat assetScaleToFitRatio;
    //assetScaleToFitRatio = [self getScaleToFitRatioCurrentTrack:currentTrack];

    if(isCurrentAssetPortrait){
        //NSLog(@"portrait");
        if (slowInfo<0) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            CGFloat ratio = screenRect.size.height / screenRect.size.width;

            // we have to adjust the ratio for 16:9 screens
            if (ratio == 1.775) ratio = 1.77777777777778;

            CGFloat complimentSize = (currentAssetTrack.naturalSize.height*ratio);
            CGFloat tx = (currentAssetTrack.naturalSize.width-complimentSize)/2;

            // invert translation because of portrait
            tx *= -1;
            // t1: rotate and position video since it may have been cropped to screen ratio
            CGAffineTransform t1 = CGAffineTransformTranslate(currentAssetTrack.preferredTransform, tx, 0);
            // t2/t3: mirror video vertically

            CGAffineTransform t2 = CGAffineTransformTranslate(t1, currentAssetTrack.naturalSize.width, 0);
            CGAffineTransform t3 = CGAffineTransformScale(t2, -1, 1);

            [currentAssetLayerInstruction setTransform:t3 atTime:duration];

        } else if (loopDur<0) {
            CGRect screenRect = [[UIScreen mainScreen] bounds];
            CGFloat ratio = screenRect.size.height / screenRect.size.width;

            // we have to adjust the ratio for 16:9 screens
            if (ratio == 1.775) ratio = 1.77777777777778;

            CGFloat complimentSize = (currentAssetTrack.naturalSize.height*ratio);
            CGFloat tx = (currentAssetTrack.naturalSize.width-complimentSize)/2;

            // invert translation because of portrait
            tx *= -1;
            // t1: rotate and position video since it may have been cropped to screen ratio
            CGAffineTransform t1 = CGAffineTransformTranslate(currentAssetTrack.preferredTransform, tx, 0);
            // t2/t3: mirror video horizontally
            CGAffineTransform t2 = CGAffineTransformTranslate(t1, 0, currentAssetTrack.naturalSize.height);
            CGAffineTransform t3 = CGAffineTransformScale(t2, 1, -1);

            [currentAssetLayerInstruction setTransform:t3 atTime:duration];

        } else {

            [currentAssetLayerInstruction setTransform:currentAssetTrack.preferredTransform atTime:duration];

        }
    }else{
        //            CGFloat translateAxisX = (currentTrack.naturalSize.width > MAX_WIDTH )?(0.0):0.0;// if use <, 640 video will be moved left by 10px. (float)(MAX_WIDTH - currentTrack.naturalSize.width)/(float)4.0
        //            CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(assetScaleToFitRatio,assetScaleToFitRatio);
        //            [currentAssetLayerInstruction setTransform:
        //             CGAffineTransformConcat(CGAffineTransformConcat(currentAssetTrack.preferredTransform, FirstAssetScaleFactor),CGAffineTransformMakeTranslation(translateAxisX, 0)) atTime:duration];
    }
    if (i==0) {
        duration=CMTimeAdd(duration, currentAsset.duration);
    } else  {
        if (videoScaleFactor!=1) {
            duration=CMTimeAdd(duration, CMTimeMake(value*videoScaleFactor, 10));
        } else {
            duration=CMTimeAdd(duration, CMTimeMake(value, 10));
        }
    }

    [currentAssetLayerInstruction setOpacity:0.0 atTime:duration];
    [arrayInstruction addObject:currentAssetLayerInstruction];
}

AVMutableCompositionTrack *AudioBGTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[AudioBGTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:CMTimeSubtract(duration, audioAsset.duration) error:nil];

videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, duration);
videoCompositionInstruction.layerInstructions = arrayInstruction;

CGSize naturalSize;
if(isCurrentAssetPortrait){
    naturalSize = CGSizeMake(MAX_HEIGHT,MAX_WIDTH);//currentAssetTrack.naturalSize.height,currentAssetTrack.naturalSize.width);
} else {
    naturalSize = CGSizeMake(MAX_WIDTH,MAX_HEIGHT);//currentAssetTrack.naturalSize;
}

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.instructions = [NSArray arrayWithObject:videoCompositionInstruction];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(naturalSize.width,naturalSize.height);
NSLog(@"prepared");

AVVideoComposition *composition = [videoComposition copy];
AVComposition *mixedComposition = [mixComposition copy];
completion(YES, composition, mixedComposition);
}

Then I set up the AVPlayer:

    -(void)playVideoWithComposition:(AVVideoComposition*)videoComposition inMutableComposition:(AVComposition*)composition{

MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:self.view animated:YES];
hud.label.text = myLanguage(@"kMergeClip");

savedComposition = [composition copy];
savedVideoComposition = [videoComposition copy];
playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;

[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(repeatVideo:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];

if (!player) {
    player = [AVPlayer playerWithPlayerItem:playerItem];
    layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = [UIScreen mainScreen].bounds;
    [self.ibPlayerView.layer insertSublayer:layer atIndex:0];
    NSLog(@"create new player");
}

if (player.currentItem != playerItem ) {
    [player replaceCurrentItemWithPlayerItem:playerItem];
}
player.actionAtItemEnd = AVPlayerActionAtItemEndNone;
//[player seekToTime:kCMTimeZero];

[playerItem addObserver:self
             forKeyPath:@"status"
                options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                context:@"AVPlayerStatus"];
}

When the user has previewed all the videos they want and taps Save, I use this method to export:

    +(void)mergeUserCLip:(AVVideoComposition*)videoComposition withAsset:(AVComposition*)mixComposition showInViewController:(UIViewController*)viewController completion:(void(^)(BOOL success, NSURL *fileURL))completion{

MBProgressHUD *hud = [MBProgressHUD showHUDAddedTo:viewController.view animated:YES];
hud.mode = MBProgressHUDModeDeterminateHorizontalBar;
hud.label.text = myLanguage(@"kMergeClip");

//Name merge clip using beat name
//NSString* beatName = [[[NSString stringWithFormat:@"%@",audioFile] lastPathComponent] stringByDeletingPathExtension];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *tmpDir = [[documentsDirectory stringByDeletingLastPathComponent] stringByAppendingPathComponent:@"tmp"];
NSString *myPathDocs =  [tmpDir stringByAppendingPathComponent:[NSString stringWithFormat:@"merge-beat.mp4"]];
//Not remove here, will remove when call previewPlayVC
[[NSFileManager defaultManager] removeItemAtPath:myPathDocs error:nil];

// 1 - set up the overlay
CALayer *overlayLayer = [CALayer layer];
UIImage *overlayImage = [UIImage imageNamed:@"watermark.png"];

[overlayLayer setContents:(id)[overlayImage CGImage]];
overlayLayer.frame = CGRectMake(720-221, 1280-109, 181, 69);
[overlayLayer setMasksToBounds:YES];

//    aLayer  = [CALayer layer];
//    [aLayer addSublayer:labelLogo.layer];
//    aLayer.frame = CGRectMake(MAX_WIDTH- labelLogo.width - 10.0, MAX_HEIGHT-50.0, 20.0, 20.0);
//    aLayer.opacity = 1;

// 2 - set up the parent layer
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, MAX_HEIGHT,MAX_WIDTH);
videoLayer.frame = CGRectMake(0, 0, MAX_HEIGHT,MAX_WIDTH);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:overlayLayer];

// 3 - apply magic
// Use -mutableCopy: -copy can return an immutable AVVideoComposition, on which animationTool cannot be set
AVMutableVideoComposition *mutableVideoComposition = [videoComposition mutableCopy];
mutableVideoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
                                  videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

NSURL *url = [NSURL fileURLWithPath:myPathDocs];
myLog(@"Path: %@", myPathDocs);
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
exporter.outputURL = url;
exporter.outputFileType = AVFileTypeMPEG4;
exporter.videoComposition = mutableVideoComposition;
exporter.shouldOptimizeForNetworkUse = NO;

[exporter exportAsynchronouslyWithCompletionHandler:^ {
    //NSLog(@"exporting");
    switch (exporter.status) {
        case AVAssetExportSessionStatusCompleted: {
            NSURL *url = [NSURL fileURLWithPath:myPathDocs];
            hud.progress = 1.0f;
            dispatch_async(dispatch_get_main_queue(), ^{
                [MBProgressHUD hideHUDForView:viewController.view animated:YES];
            });
            [self checkTmpSize];
            if (completion) {
                completion(YES, url);
            }
        }
            break;
        case AVAssetExportSessionStatusExporting:
            myLog(@"Exporting!");
            break;
        case AVAssetExportSessionStatusWaiting:
            myLog(@"Waiting");
            break;
        case AVAssetExportSessionStatusFailed:
            // Surface the failure instead of silently swallowing it
            myLog(@"Export failed: %@", exporter.error);
            dispatch_async(dispatch_get_main_queue(), ^{
                [MBProgressHUD hideHUDForView:viewController.view animated:YES];
            });
            if (completion) {
                completion(NO, nil);
            }
            break;
        default:
            break;
    }
}];
}

If I select options that loop fewer than 8 times, the code above works fine. If I select options that loop more than 8 times, the export session freezes, showing export.progress = 0.0000000. If I remove this line:

    playerItem.videoComposition = videoComposition;

then I cannot preview the mixed video, but I am able to export normally (up to 16 tracks).

Or, if I remove this line in the export code:

    exporter.videoComposition = mutableVideoComposition;

then I can preview the mixed video and export normally, but WITHOUT the video composition.

So I guess there's something wrong with AVVideoComposition and/or the way I implement it.

I would appreciate any suggestions. Many thanks.

I strongly suspect that using AVPlayer to preview the video somehow hinders AVAssetExportSession, as described in the posts below:

iOS 5: Error merging 3 videos with AVAssetExportSession

AVPlayerItem fails with AVStatusFailed and error code "Cannot Decode"

Answer

I ran into this issue while attempting to concatenate N videos while also playing up to 3 of them in AVPlayer instances inside a UICollectionView. As discussed in the Stack Overflow question you linked, iOS can only handle so many instances of AVPlayer; each instance uses up a "render pipeline". I discovered that each instance of AVMutableCompositionTrack also uses up one of these render pipelines.
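
For example, you could tear down the preview player before kicking off the export so that its pipeline is freed up again. This is only a rough sketch, using the player, playerItem, and layer ivars from your question; it is not something your code currently does:

    // Rough sketch: release the preview player before exporting so its render
    // pipeline is handed back for AVAssetExportSession to use.
    -(void)tearDownPreviewPlayer{
        [player pause];
        [playerItem removeObserver:self forKeyPath:@"status"];
        [[NSNotificationCenter defaultCenter] removeObserver:self
                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                      object:playerItem];
        [player replaceCurrentItemWithPlayerItem:nil];
        [layer removeFromSuperlayer];
        player = nil;
        playerItem = nil;
        layer = nil;
    }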

Therefore if you use too many AVPlayer instances or try to create an AVMutableComposition with too many AVMutableCompositionTrack tracks, you can run out of resources to decode H264 and you will receive the "Cannot Decode" error. I was able to get around the issue by only using two instances of AVMutableCompositionTrack. This way I could "overlap" segments of video while also applying transitions (which requires two video tracks to "play" concurrently).

In short: minimize your usage of AVMutableCompositionTrack as well as AVPlayer. You can check out the AVCustomEdit sample code by Apple for an example of this. Specifically, check out the buildTransitionComposition method inside the APLSimpleEditor class.
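
Here is a rough sketch of the two-track idea (illustrative only; sourceVideoTrack and segmentTimeRanges are placeholder names, not taken from your code or from the sample):

    // Rough sketch: build a composition that reuses just two video tracks.
    // Requires <AVFoundation/AVFoundation.h>. `sourceVideoTrack` and
    // `segmentTimeRanges` (NSValue-wrapped CMTimeRanges) are placeholder inputs.
    + (AVMutableComposition *)compositionWithSegments:(NSArray<NSValue *> *)segmentTimeRanges
                                              ofTrack:(AVAssetTrack *)sourceVideoTrack {
        AVMutableComposition *composition = [AVMutableComposition composition];
        NSArray *videoTracks = @[
            [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                     preferredTrackID:kCMPersistentTrackID_Invalid],
            [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                     preferredTrackID:kCMPersistentTrackID_Invalid]
        ];

        CMTime cursor = kCMTimeZero;
        for (NSUInteger i = 0; i < segmentTimeRanges.count; i++) {
            CMTimeRange segmentRange = [segmentTimeRanges[i] CMTimeRangeValue];
            // Alternate between the two tracks instead of adding a new track per
            // segment; per-segment layer instructions (or a custom compositor)
            // then decide which track is visible at any given time.
            AVMutableCompositionTrack *destinationTrack = videoTracks[i % 2];
            [destinationTrack insertTimeRange:segmentRange
                                      ofTrack:sourceVideoTrack
                                       atTime:cursor
                                        error:nil];
            cursor = CMTimeAdd(cursor, segmentRange.duration);
        }
        return composition;
    }

With only two video tracks (plus audio), the composition stays well below the pipeline limit no matter how many segments you insert.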
