Is it possible to merge two video files to one file, one screen in iOS?


Question

I'm new to video programming. I've been experimenting with it, but I'm having trouble merging two video files into one.

What I mean by merging is the following.

I have a first video like this,

and a second video like this,

and I want them merged into one like this.

I don't want to use two video players, because I want to send the merged video file to someone. I searched all day to solve this, but I couldn't find how.

I wrote code referencing this link, but it shows the first video only, not both merged.

My code:

NSURL *firstURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video1" ofType:@"mp4"]];
AVURLAsset  *firstAsset = [[AVURLAsset alloc]initWithURL:firstURL options:nil];

NSURL *secondURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video2" ofType:@"mp4"]];
AVURLAsset  *secondAsset = [[AVURLAsset alloc]initWithURL:secondURL options:nil];

AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                    ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:kCMTimeZero error:nil];

AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];

[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                     ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                      atTime:kCMTimeZero error:nil];

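// NOTE: preferredTransform on a composition track is only a hint for players;
// without an AVVideoComposition, an export won't scale or overlay tracks this way.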
[secondTrack setPreferredTransform:CGAffineTransformMakeScale(0.25f,0.25f)];

NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"FinalVideo.mov"]];

NSLog(@"%@", outputFilePath);

NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
    [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];


AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
assetExport.outputFileType = @"com.apple.quicktime-movie";
assetExport.outputURL = outputFileUrl;

[assetExport exportAsynchronouslyWithCompletionHandler: ^(void ) {

     switch (assetExport.status) {
         case AVAssetExportSessionStatusFailed:
             NSLog(@"AVAssetExportSessionStatusFailed");
             break;
         case AVAssetExportSessionStatusCompleted:
             NSLog(@"AVAssetExportSessionStatusCompleted");
             break;
         case AVAssetExportSessionStatusWaiting:
             NSLog(@"AVAssetExportSessionStatusWaiting");
             break;
         default:
             break;
     }
 }
 ];

What am I missing? I don't know how to approach this problem.

Appreciate any ideas. Thanks.

EDIT: I wrote new code referencing a link matt posted (thanks, matt), but when I tried to export it, only the first video was exported, not both together. :(

My new code:

NSURL *originalVideoURL1 = [[NSBundle mainBundle] URLForResource:@"video1" withExtension:@"mov"];
NSURL *originalVideoURL2 = [[NSBundle mainBundle] URLForResource:@"video2" withExtension:@"mov"];


AVURLAsset *firstAsset = [AVURLAsset URLAssetWithURL:originalVideoURL1 options:nil];
AVURLAsset *secondAsset = [AVURLAsset URLAssetWithURL:originalVideoURL2 options:nil];

AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init]; //[AVMutableComposition composition];

NSError *error = nil;
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];

if(error) {
    NSLog(@"firstTrack error!!!. %@", error.localizedDescription);
}

AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];

if(error) {
    NSLog(@"secondTrack error!!!. %@", error.localizedDescription);
}


AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);

AVMutableVideoCompositionLayerInstruction *firstLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
CGAffineTransform scale = CGAffineTransformMakeScale(0.7, 0.7);
CGAffineTransform move = CGAffineTransformMakeTranslation(230, 230);
[firstLayerInstruction setTransform:CGAffineTransformConcat(scale, move) atTime:kCMTimeZero];

AVMutableVideoCompositionLayerInstruction *secondLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
CGAffineTransform secondScale = CGAffineTransformMakeScale(1.2, 1.5);
CGAffineTransform secondMove = CGAffineTransformMakeTranslation(0, 0);
[secondLayerInstruction setTransform:CGAffineTransformConcat(secondScale, secondMove) atTime:kCMTimeZero];

mainInstruction.layerInstructions = @[firstLayerInstruction, secondLayerInstruction];

AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = @[mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
mainCompositionInst.renderSize = CGSizeMake(640, 480);

AVPlayerItem *newPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
newPlayerItem.videoComposition = mainCompositionInst;

AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:newPlayerItem];

AVPlayerLayer *playerLayer =[AVPlayerLayer playerLayerWithPlayer:player];

[playerLayer setFrame:self.view.bounds];
[self.view.layer addSublayer:playerLayer];
[player seekToTime:kCMTimeZero];
[player play]; // play is Good!!


NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];

NSString *tempS2 = [documentsDirectory stringByAppendingPathComponent:@"FinalVideo.mov"];

if([[NSFileManager defaultManager] fileExistsAtPath:tempS2])
{
    [[NSFileManager defaultManager] removeItemAtPath:tempS2 error:nil];
}


NSURL *url = [[NSURL alloc] initFileURLWithPath: tempS2];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                       initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];

exportSession.outputURL=url;

NSLog(@"%@", [exportSession supportedFileTypes]);

exportSession.outputFileType = AVFileTypeQuickTimeMovie;
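// NOTE: no videoComposition is set on the export session here, so the layer
// instructions above are ignored when exporting.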
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status==AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    }
    else {
        NSLog(@"AudioLocation : %@",tempS2);
    }
}];

How can I export both my mixComposition and the layer instructions?

Please give me some more ideas.

Thanks.

Answer

With reference to the code in your second edit: just as you've told the AVPlayerItem about your AVMutableVideoComposition, you need to tell the AVAssetExportSession too:

exportSession.videoComposition = mainCompositionInst;
// exportAsynchronouslyWithCompletionHandler etc
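
Put together, the export step from your second edit might look like the sketch below. It reuses mixComposition, mainCompositionInst, and url from your code; the completion handling is just one reasonable way to report the result:

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = url;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.videoComposition = mainCompositionInst; // the missing line

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished: %@", exportSession.outputURL);
    } else {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];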

N.B. Make sure you choose the longer of the two track durations when setting your instruction duration:

mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMaximum(firstAsset.duration, secondAsset.duration));

AVPlayer doesn't mind if you get this wrong, but AVAssetExportSession does, and it will return an AVErrorInvalidVideoComposition (-11841) error.
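
If you do hit that error, it shows up in the export completion handler; a minimal check might look like this (assuming the exportSession from above):

if (exportSession.status == AVAssetExportSessionStatusFailed &&
    exportSession.error.code == AVErrorInvalidVideoComposition) {
    // the instruction's timeRange doesn't cover the composition's full duration
    NSLog(@"Invalid video composition: %@", exportSession.error);
}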

N.B. 2: Your AVPlayer isn't actually going out of scope, but it makes me nervous when I look at it. I'd assign it to a property if I were you.
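
A minimal sketch of that suggestion (the property name is just an example):

// in the class extension:
@property (nonatomic, strong) AVPlayer *player;

// in the setup code, instead of a local variable:
self.player = [[AVPlayer alloc] initWithPlayerItem:newPlayerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];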

