Video merging in background iOS


Problem Description

Task: Merge flyer image into flyer video.

Use Case


  • Create a flyer [add emoticon images/text, etc.]

  • Make a video

Case 1


  • Press the Back button [the user goes back to the app's list-of-flyers screen]; during this time we merge flyerSnapShoot into flyerVideo, and it works perfectly.

  • Go to the phone's photo library and we see the updated video there.

Case 2


  • Press the iPhone Home button; I am doing the same things as above, but I face the following error.

FAIL = Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17266d40 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x172b3920 "The operation couldn't be completed. (OSStatus error -16980.)", NSLocalizedFailureReason=An unknown error occurred (-16980)}

Code

- (void)modifyVideo:(NSURL *)src destination:(NSURL *)dest crop:(CGRect)crop
              scale:(CGFloat)scale overlay:(UIImage *)image
         completion:(void (^)(NSInteger, NSError *))callback {
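    // NOTE (assumption): MAX_VIDEO_LENGTH and VIDEOFRAME are project-level macros
    // (e.g. maximum clip length in seconds and the frame rate) defined elsewhere;
    // they are not part of AVFoundation.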

    // Get a pointer to the asset
    AVURLAsset* firstAsset = [AVURLAsset URLAssetWithURL:src options:nil];

    // Make an instance of AVMutableComposition so that we can edit this asset:
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    // Add tracks to this composition
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    // Audio track
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    // Image video is always 30 seconds. So we use that unless the background video is smaller.
    CMTime inTime = CMTimeMake( MAX_VIDEO_LENGTH * VIDEOFRAME, VIDEOFRAME );
    if ( CMTimeCompare( firstAsset.duration, inTime ) < 0 ) {
        inTime = firstAsset.duration;
    }

    // Add to the video track.
    NSArray *videos = [firstAsset tracksWithMediaType:AVMediaTypeVideo];
    CGAffineTransform transform;
    if ( videos.count > 0 ) {
        AVAssetTrack *track = [videos objectAtIndex:0];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, inTime) ofTrack:track atTime:kCMTimeZero error:nil];
        transform = track.preferredTransform;
        videoTrack.preferredTransform = transform;
    }

    // Add the audio track.
    NSArray *audios = [firstAsset tracksWithMediaType:AVMediaTypeAudio];
    if ( audios.count > 0 ) {
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, inTime) ofTrack:[audios objectAtIndex:0] atTime:kCMTimeZero error:nil];
    }

    NSLog(@"Natural size: %.2f x %.2f", videoTrack.naturalSize.width, videoTrack.naturalSize.height);

    // Set the mix composition size.
    mixComposition.naturalSize = crop.size;

    // Set up the composition parameters.
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, VIDEOFRAME );
    videoComposition.renderSize = crop.size;
    videoComposition.renderScale = 1.0;

    // Pass through parameters for animation.
    AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, inTime);

    // Layer instructions
    AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

    // Set the transform to maintain orientation
    if ( scale != 1.0 ) {
        CGAffineTransform scaleTransform = CGAffineTransformMakeScale( scale, scale);
        CGAffineTransform translateTransform = CGAffineTransformTranslate( CGAffineTransformIdentity,
                                                                          -crop.origin.x,
                                                                          -crop.origin.y);
        transform = CGAffineTransformConcat( transform, scaleTransform );
        transform = CGAffineTransformConcat( transform, translateTransform);
    }

    [passThroughLayer setTransform:transform atTime:kCMTimeZero];

    passThroughInstruction.layerInstructions = @[ passThroughLayer ];
    videoComposition.instructions = @[passThroughInstruction];

    // If an image is given, then put that in the animation.
    if ( image != nil ) {

        // Layer that merges the video and image
        CALayer *parentLayer = [CALayer layer];
        parentLayer.frame = CGRectMake( 0, 0, crop.size.width, crop.size.height);

        // Layer that renders the video.
        CALayer *videoLayer = [CALayer layer];
        videoLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height );
        [parentLayer addSublayer:videoLayer];

        // Layer that renders the flyer image.
        CALayer *imageLayer = [CALayer layer];
        imageLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height );
        imageLayer.contents = (id)image.CGImage;
        [imageLayer setMasksToBounds:YES];

        [parentLayer addSublayer:imageLayer];

        // Setup the animation tool
        videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    }

    // Now export the movie
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = videoComposition;

    // Export the URL
    exportSession.outputURL = dest;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.shouldOptimizeForNetworkUse = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        callback( exportSession.status, exportSession.error );
    }];
}

I am calling this function from AppDelegate.m

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    bgTask = [application beginBackgroundTaskWithName:@"MyTask" expirationHandler:^{
        // Clean up any unfinished task business by marking where you
        // stopped or ending the task outright.
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];

    // Start the long-running task and return immediately.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{

        // Do the work associated with the task, preferably in chunks.
         [self goingToBg];

        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    });

    NSLog(@"backgroundTimeRemaining: %f", [[UIApplication sharedApplication] backgroundTimeRemaining]);
}
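
For context, here is a minimal sketch of what the goingToBg helper referenced above might look like. The property names (flyerVideoURL, mergedVideoURL, flyerSnapShoot), the crop rect, and the assumption that modifyVideo:destination:crop:scale:overlay:completion: is reachable from the same class are all illustrative; the original implementation is not shown in the question.

// A minimal sketch, not the asker's actual implementation.
// Assumes modifyVideo:destination:crop:scale:overlay:completion: is reachable
// from this class and that the URLs/image below exist as properties (hypothetical names).
- (void)goingToBg
{
    NSURL *src       = self.flyerVideoURL;   // source flyer video (assumed property)
    NSURL *dest      = self.mergedVideoURL;  // output location (assumed property)
    UIImage *overlay = self.flyerSnapShoot;  // flyer snapshot image (assumed property)
    CGRect crop      = CGRectMake(0, 0, 640, 640); // assumed render size

    [self modifyVideo:src destination:dest crop:crop scale:1.0 overlay:overlay
           completion:^(NSInteger status, NSError *error) {
        if (status == AVAssetExportSessionStatusCompleted) {
            NSLog(@"Merge finished: %@", dest);
        } else {
            NSLog(@"Merge failed: %@", error);
        }
    }];
}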


Recommended Answer

Did a lot of R&D on this issue, but didn't find a solution for it.

Want to share a few links; hope it will help the Stack community if they run into the same problem [requirement].

Link 1: AVExportSession to run in background

Quote related to the question [copied from Link 1 above]


Sadly, since AVAssetExportSession uses the GPU to do some of its work, it cannot run in the background if you are using an AVVideoComposition.

Link 2: Start AVAssetExportSession in background

Quote related to the question [copied from Link 2 above]

You can start AVAssetExportSession in the background. The only limitation in AVFoundation on performing work in the background is using AVVideoCompositions or AVMutableVideoCompositions. AVVideoCompositions use the GPU, and the GPU cannot be used in the background.
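
Given that limitation, one common workaround is to defer the GPU-backed export until the app is active again. The sketch below is illustrative only; the mergePending flag and the startOrDeferMerge method are not from the original question.

// Minimal sketch (assumption): postpone the export while backgrounded, because
// an AVVideoComposition cannot render on the GPU in the background.
- (void)startOrDeferMerge
{
    if ([UIApplication sharedApplication].applicationState == UIApplicationStateBackground) {
        self.mergePending = YES;   // hypothetical BOOL property
    } else {
        [self goingToBg];          // kicks off the export from the question
    }
}

- (void)applicationDidBecomeActive:(UIApplication *)application
{
    // Resume the deferred merge once the GPU is available again.
    if (self.mergePending) {
        self.mergePending = NO;
        [self goingToBg];
    }
}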

URLs for background tasks:

APPLE DEV URL

RAYWENDERLICH URL

Stack question
