Cropping AVAsset video with AVFoundation


Question


I am using AVCaptureMovieFileOutput to record some video. I have the preview layer displayed using AVLayerVideoGravityResizeAspectFill, which zooms in slightly. The problem I have is that the final video is larger, containing extra image content that didn't fit on the screen during the preview.

This is the preview and resulting video

Is there a way I can specify a CGRect that I want to cut from the video using AVAssetExportSession?

EDIT ----

When I apply a CGAffineTransformScale to the AVAssetTrack it zooms into the video, and with the AVMutableVideoComposition renderSize set to view.bounds it crops off the ends. Great, there's just one problem left: the width of the video does not stretch to the correct width; it just gets filled with black.

EDIT 2 ---- The suggested question/answer is incomplete.

Some of my code:

In my - (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error method I have this to crop and resize the video.

- (void)flipAndSave:(NSURL *)videoURL withCompletionBlock:(void(^)(NSURL *returnURL))completionBlock
{
    AVURLAsset *firstAsset = [AVURLAsset assetWithURL:videoURL];

    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // 2 - Video track
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];

    // 2.1 - Create AVMutableVideoCompositionInstruction
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(CMTimeMakeWithSeconds(0, 600), firstAsset.duration);

    // 2.2 - Create an AVMutableVideoCompositionLayerInstruction for the first track
    AVMutableVideoCompositionLayerInstruction *firstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    AVAssetTrack *firstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation firstAssetOrientation_  = UIImageOrientationUp;
    BOOL isFirstAssetPortrait_  = NO;
    CGAffineTransform firstTransform = firstAssetTrack.preferredTransform;
    if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) {
        firstAssetOrientation_ = UIImageOrientationRight;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) {
        firstAssetOrientation_ =  UIImageOrientationLeft;
        isFirstAssetPortrait_ = YES;
    }
    if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) {
        firstAssetOrientation_ =  UIImageOrientationUp;

    }
    if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) {
        firstAssetOrientation_ = UIImageOrientationDown;
    }
//    [firstlayerInstruction setTransform:firstAssetTrack.preferredTransform atTime:kCMTimeZero];

//    [firstlayerInstruction setCropRectangle:self.view.bounds atTime:kCMTimeZero];





    CGFloat scale = [self getScaleFromAsset:firstAssetTrack];

    firstTransform = CGAffineTransformScale(firstTransform, scale, scale);

    [firstlayerInstruction setTransform:firstTransform atTime:kCMTimeZero];

    // 2.4 - Add instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:firstlayerInstruction,nil];
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

//    CGSize videoSize = firstAssetTrack.naturalSize;
    CGSize videoSize = self.view.bounds.size;
    BOOL isPortrait_ = [self isVideoPortrait:firstAsset];
    if(isPortrait_) {
        videoSize = CGSizeMake(videoSize.height, videoSize.width);
    }
    NSLog(@"%@", NSStringFromCGSize(videoSize));
    mainCompositionInst.renderSize = videoSize;




    // 3 - Audio track
    AVMutableCompositionTrack *AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [AudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];

    // 4 - Get path
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"cutoutput.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *manager = [[NSFileManager alloc] init];
    if ([manager fileExistsAtPath:outputPath])
    {
        [manager removeItemAtPath:outputPath error:nil];
    }
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status])
        {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@ : %@", [[exporter error] localizedDescription], [exporter error]);
                completionBlock(nil);

                break;
            case AVAssetExportSessionStatusCancelled:

                NSLog(@"Export canceled");
                completionBlock(nil);

                break;
            default: {
                NSURL *outputURL = exporter.outputURL;
                dispatch_async(dispatch_get_main_queue(), ^{
                    completionBlock(outputURL);
                });

                break;
            }
        }
    }];
}
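
The isVideoPortrait: and getScaleFromAsset: helpers referenced above are not included in the question. A minimal sketch of what they might look like, assuming getScaleFromAsset: reproduces the zoom factor that AVLayerVideoGravityResizeAspectFill applied in the preview (both bodies are assumptions, not the asker's actual code):

- (BOOL)isVideoPortrait:(AVAsset *)asset {
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CGAffineTransform t = track.preferredTransform;
    // treat any 90-degree rotation in the preferred transform as portrait
    return (t.a == 0 && t.d == 0 && (t.b == 1.0 || t.b == -1.0));
}

- (CGFloat)getScaleFromAsset:(AVAssetTrack *)track {
    // aspect-fill scales the video until it covers the layer,
    // i.e. by the larger of the two bounds-to-naturalSize ratios
    CGSize bounds = self.view.bounds.size;
    CGSize natural = track.naturalSize;
    return MAX(bounds.width / natural.width, bounds.height / natural.height);
}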

Solution

Here is my interpretation of your question: you are capturing video on a device with a screen ratio of 4:3, so your AVCaptureVideoPreviewLayer is 4:3, but the video input device captures video in 16:9, and the resulting video is therefore 'larger' than what you see in the preview.

If you are simply looking to crop out the extra pixels not shown in the preview, check out http://www.netwalk.be/article/record-square-video-ios. That article shows how to crop a video into a square, and only a few modifications are needed to crop to 4:3 instead. I've tested this; here are the changes I made:

Once you have the AVAssetTrack for the video, you will need to calculate a new height.

// we convert the captured height i.e. 1080 to a 4:3 screen ratio and get the new height
CGFloat newHeight = clipVideoTrack.naturalSize.height/3*4;

Then modify these two lines, using newHeight.

videoComposition.renderSize = CGSizeMake(clipVideoTrack.naturalSize.height, newHeight);

CGAffineTransform t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height, -(clipVideoTrack.naturalSize.width - newHeight)/2 );

So what we've done here is set the renderSize to a 4:3 ratio; the exact dimensions are based on the input device. We then use a CGAffineTransform to translate the video position so that what we saw in the AVCaptureVideoPreviewLayer is what is rendered to our file.
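
As a concrete example: for a 1920×1080 capture, newHeight = 1080 / 3 * 4 = 1440, so the renderSize becomes 1080×1440 and the -(1920 - 1440) / 2 = -240 translation discards 240 pixels from each end of the long side, matching what the 4:3 preview showed.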

Edit: if you want to put it all together, crop a video based on the device's screen ratio (3:2, 4:3, 16:9), and take the video orientation into account, we need to add a few things.

First, here is the modified sample code with a few critical alterations:

// output file
NSString* docFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
NSString* outputPath = [docFolder stringByAppendingPathComponent:@"output2.mov"];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputPath])
    [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];

// input file
AVAsset* asset = [AVAsset assetWithURL:outputFileURL];

AVMutableComposition *composition = [AVMutableComposition composition];
[composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
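// note: this composition is not actually used below; the exporter is
// initialised with the original asset and relies on the videoComposition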

// input clip
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

// crop clip to screen ratio
UIInterfaceOrientation orientation = [self orientationForTrack:asset];
BOOL isPortrait = (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationPortraitUpsideDown) ? YES: NO;
CGFloat complimentSize = [self getComplimentSize:videoTrack.naturalSize.height];
CGSize videoSize;

if(isPortrait) {
    videoSize = CGSizeMake(videoTrack.naturalSize.height, complimentSize);
} else {
    videoSize = CGSizeMake(complimentSize, videoTrack.naturalSize.height);
}

AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = videoSize;
videoComposition.frameDuration = CMTimeMake(1, 30);

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(60, 30) );
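// note: this hard-codes a 60-second time range; a general version would use
// CMTimeRangeMake(kCMTimeZero, asset.duration)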

// rotate and position video
AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

CGFloat tx = (videoTrack.naturalSize.width-complimentSize)/2;
if (orientation == UIInterfaceOrientationPortrait || orientation == UIInterfaceOrientationLandscapeRight) {
    // invert translation
    tx *= -1;
}

// t1: rotate and position video since it may have been cropped to screen ratio
CGAffineTransform t1 = CGAffineTransformTranslate(videoTrack.preferredTransform, tx, 0);
// t2/t3: mirror video horizontally
CGAffineTransform t2 = CGAffineTransformTranslate(t1, isPortrait?0:videoTrack.naturalSize.width, isPortrait?videoTrack.naturalSize.height:0);
CGAffineTransform t3 = CGAffineTransformScale(t2, isPortrait?1:-1, isPortrait?-1:1);

[transformer setTransform:t3 atTime:kCMTimeZero];
instruction.layerInstructions = [NSArray arrayWithObject: transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];

// export
exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = [NSURL fileURLWithPath:outputPath];
exporter.outputFileType = AVFileTypeQuickTimeMovie;

[exporter exportAsynchronouslyWithCompletionHandler:^(void){
    NSLog(@"Exporting done!");

    // added export to library for testing
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath]]) {
        [library writeVideoAtPathToSavedPhotosAlbum:[NSURL fileURLWithPath:outputPath]
                                    completionBlock:^(NSURL *assetURL, NSError *error) {
             NSLog(@"Saved to album");
             if (error) {

             }
         }];
    }
}];
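
Two notes on this snippet: exporter is assigned without a local declaration, so it is presumably an instance variable (the export session has to outlive the method for the asynchronous export to complete), and ALAssetsLibrary has since been deprecated. On iOS 9 and later the equivalent test save could go through the Photos framework; a rough sketch, assuming outputPath as above:

#import <Photos/Photos.h>

[[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
    // create a new video asset in the user's photo library
    [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:[NSURL fileURLWithPath:outputPath]];
} completionHandler:^(BOOL success, NSError *error) {
    NSLog(@"Saved to album: %@", success ? @"yes" : error);
}];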

What we added here is a call to get the new render size of the video, based on cropping its dimensions to the screen ratio. Once we crop the size down, we need to translate the position to re-center the video, so we grab its orientation and move it in the proper direction. This fixes the off-center issue we saw with UIInterfaceOrientationLandscapeLeft. Finally, CGAffineTransform t2 and t3 mirror the video horizontally.
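
For instance, a portrait clip recorded at 1920×1080 on a 4:3 device gives complimentSize = 1080 × 4/3 = 1440, so tx = (1920 - 1440) / 2 = 240; because the orientation is portrait the sign is flipped, and the resulting -240 translation re-centers the cropped frame.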

And here are the two new methods that make this happen:

- (CGFloat)getComplimentSize:(CGFloat)size {
    CGRect screenRect = [[UIScreen mainScreen] bounds];
    CGFloat ratio = screenRect.size.height / screenRect.size.width;

    // we have to adjust the ratio for 16:9 screens
    if (ratio == 1.775) ratio = 1.77777777777778;

    return size * ratio;
}

- (UIInterfaceOrientation)orientationForTrack:(AVAsset *)asset {
    UIInterfaceOrientation orientation = UIInterfaceOrientationPortrait;
    NSArray *tracks = [asset tracksWithMediaType:AVMediaTypeVideo];

    if([tracks count] > 0) {
        AVAssetTrack *videoTrack = [tracks objectAtIndex:0];
        CGAffineTransform t = videoTrack.preferredTransform;

        // Portrait
        if(t.a == 0 && t.b == 1.0 && t.c == -1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortrait;
        }
        // PortraitUpsideDown
        if(t.a == 0 && t.b == -1.0 && t.c == 1.0 && t.d == 0) {
            orientation = UIInterfaceOrientationPortraitUpsideDown;
        }
        // LandscapeRight
        if(t.a == 1.0 && t.b == 0 && t.c == 0 && t.d == 1.0) {
            orientation = UIInterfaceOrientationLandscapeRight;
        }
        // LandscapeLeft
        if(t.a == -1.0 && t.b == 0 && t.c == 0 && t.d == -1.0) {
            orientation = UIInterfaceOrientationLandscapeLeft;
        }
    }
    return orientation;
}

These are pretty straightforward. The only thing to note is that in the getComplimentSize: method we have to manually adjust the ratio for 16:9, since the iPhone 5+ resolution is mathematically just shy of true 16:9.
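
For example, the iPhone 5 screen is 320 × 568 points, and 568 / 320 = 1.775, slightly short of 16/9 ≈ 1.7778; without the adjustment, a 1080-pixel-high track would get a complimentSize of 1080 × 1.775 = 1917 instead of the full 1920.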
