How can I compress a video in iOS using bit rate?


Problem description

How can I compress a video using bit rate?

I tried the code below to compress a video, but it's not working: it gives me an error like

Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[AVAssetReader startReading] cannot be called again after reading has already started'

     - (void) imagePickerController: (UIImagePickerController *) picker
          didFinishPickingMediaWithInfo: (NSDictionary *) info 
     {


        // Handle movie capture
        NSURL *movieURL = [info objectForKey:
                           UIImagePickerControllerMediaURL];

        NSData *data = [NSData dataWithContentsOfURL:movieURL];

        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,       NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *tempPath = [documentsDirectory stringByAppendingFormat:@"/vid1.mp4"];

        BOOL success = [data writeToFile:tempPath atomically:NO];

        if (success)
        {
                      NSLog(@"VIdeo Successfully written");
        }
        else
        {
                       NSLog(@"VIdeo Wrting failed");
        }


        NSURL *uploadURL = [NSURL fileURLWithPath:[[NSTemporaryDirectory() stringByAppendingPathComponent:@"1234"] stringByAppendingString:@".mp4"]];

        // Compress movie first
        [self convertVideoToLowQuailtyWithInputURL:movieURL outputURL:uploadURL];
    }




 - (void)convertVideoToLowQuailtyWithInputURL:(NSURL*)inputURL
                                       outputURL:(NSURL*)outputURL
    {
        //setup video writer
        AVAsset *videoAsset = [[AVURLAsset alloc] initWithURL:inputURL options:nil];

        AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo]   objectAtIndex:0];

        CGSize videoSize = videoTrack.naturalSize;

        NSDictionary *videoWriterCompressionSettings =  [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:1250000], AVVideoAverageBitRateKey, nil];

        NSDictionary *videoWriterSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey, videoWriterCompressionSettings, AVVideoCompressionPropertiesKey, [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey, [NSNumber numberWithFloat:videoSize.height], AVVideoHeightKey, nil];

        AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                                assetWriterInputWithMediaType:AVMediaTypeVideo
                                                outputSettings:videoWriterSettings];

        videoWriterInput.expectsMediaDataInRealTime = YES;

        videoWriterInput.transform = videoTrack.preferredTransform;

        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:nil];

        [videoWriter addInput:videoWriterInput];

        //setup video reader
        NSDictionary *videoReaderSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey];

        AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:videoReaderSettings];

        AVAssetReader *videoReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:nil];

        [videoReader addOutput:videoReaderOutput];

        //setup audio writer
        AVAssetWriterInput* audioWriterInput = [AVAssetWriterInput
                                                assetWriterInputWithMediaType:AVMediaTypeAudio
                                                outputSettings:nil];

        audioWriterInput.expectsMediaDataInRealTime = NO;

        [videoWriter addInput:audioWriterInput];

        //setup audio reader
        AVAssetTrack* audioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

        AVAssetReaderOutput *audioReaderOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];

        AVAssetReader *audioReader = [AVAssetReader assetReaderWithAsset:videoAsset error:nil];

        [audioReader addOutput:audioReaderOutput];

        [videoWriter startWriting];

        //start writing from video reader
        [videoReader startReading];

        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue1", NULL);

        [videoWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:
         ^{

             while ([videoWriterInput isReadyForMoreMediaData])
             {

                 CMSampleBufferRef sampleBuffer;

                 if ([videoReader status] == AVAssetReaderStatusReading &&
                     (sampleBuffer = [videoReaderOutput copyNextSampleBuffer]))
                 {

                     [videoWriterInput appendSampleBuffer:sampleBuffer];
                     CFRelease(sampleBuffer);
                 }

                 else
                 {
                     [videoWriterInput markAsFinished];

                     if ([videoReader status] == AVAssetReaderStatusCompleted)
                     {
                             [audioReader startReading];

                             [videoWriter startSessionAtSourceTime:kCMTimeZero];

                             dispatch_queue_t processingQueue = dispatch_queue_create("processingQueue2", NULL);

                             [audioWriterInput requestMediaDataWhenReadyOnQueue:processingQueue usingBlock:^{

                                 while (audioWriterInput.readyForMoreMediaData)
                                 {
                                     CMSampleBufferRef sampleBuffer;

                                     if ([audioReader status] == AVAssetReaderStatusReading &&
                                         (sampleBuffer = [audioReaderOutput copyNextSampleBuffer])) {

                                         [audioWriterInput appendSampleBuffer:sampleBuffer];
                                         CFRelease(sampleBuffer);
                                     }

                                     else
                                     {

                                         [audioWriterInput markAsFinished];

                                         if ([audioReader status] == AVAssetReaderStatusCompleted)
                                         {
                                             [videoWriter finishWritingWithCompletionHandler:^()
                                             {
                                                 NSLog(@"Output URl : %@",outputURL);
                                             }];
                                         }
                                     }
                                 }

                             }
                              ];                     
                     }
                 }
             }
         }

         ];


    }
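A likely cause of the exception (a reading of the code above, not a confirmed diagnosis): the block passed to `requestMediaDataWhenReadyOnQueue:usingBlock:` can be invoked more than once, so the branch that calls `[audioReader startReading]` may run again after audio reading has already begun, and `startReading` is one-shot per reader. A minimal guard, sketched against the variables in the method above:

```objc
// Sketch: make the one-shot calls idempotent so re-invocations of the
// ready-block cannot trigger them twice. videoReader and audioReader
// are the readers created in the method above.
if (videoReader.status == AVAssetReaderStatusCompleted &&
    audioReader.status == AVAssetReaderStatusUnknown)   // not started yet
{
    [audioReader startReading];
    // ... then request audio media data as before ...
}
```

The same pattern (check `status == AVAssetReaderStatusUnknown` before calling `startReading`) applies to any reader that a repeatedly-invoked block might touch.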

Solution

You can use the export presets below to control the quality of the compressed video.

  • AVAssetExportPresetLowQuality
  • AVAssetExportPresetMediumQuality
  • AVAssetExportPresetHighestQuality
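If you don't need the composition and orientation handling shown in the full code below, a preset can be applied directly to the source asset. This is a minimal sketch; `inputURL` and `outputURL` are placeholders for your own file URLs:

```objc
// Minimal sketch: export an asset at a preset quality level.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];

AVAssetExportSession *session =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetMediumQuality];
session.outputURL = outputURL;
session.outputFileType = AVFileTypeQuickTimeMovie;
session.shouldOptimizeForNetworkUse = YES;

[session exportAsynchronouslyWithCompletionHandler:^{
    // Check session.status (and session.error on failure) here.
}];
```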

Code:

// Note: firstAsset (AVAsset), VideoWidth (CGFloat) and videoUrToUload (NSURL)
// are assumed to be declared elsewhere in the answerer's class.
- (void)CompressVideo
{
    if(firstAsset !=nil)
    {
        //Create AVMutableComposition Object.This object will hold our multiple AVMutableCompositionTrack.
        AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];

        //        http://stackoverflow.com/questions/22715881/merge-video-files-with-their-original-audio-in-ios

        //VIDEO TRACK
        AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];

        //For Audio Track inclusion
        //============================================================================================
        NSArray *arr = [firstAsset tracksWithMediaType:AVMediaTypeAudio];
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[arr lastObject] atTime:kCMTimeZero error:nil];
        //===============================================================================================

        AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);

        //FIXING ORIENTATION//
        AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
        AVAssetTrack *FirstAssetTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        UIImageOrientation FirstAssetOrientation_  = UIImageOrientationUp;
        BOOL  isFirstAssetPortrait_  = NO;

        CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform;
        if(firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0)  {FirstAssetOrientation_= UIImageOrientationRight; isFirstAssetPortrait_ = YES;}
        if(firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0)  {FirstAssetOrientation_ =  UIImageOrientationLeft; isFirstAssetPortrait_ = YES;}
        if(firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0)   {FirstAssetOrientation_ =  UIImageOrientationUp;}
        if(firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) {FirstAssetOrientation_ = UIImageOrientationDown;
        }
        CGFloat FirstAssetScaleToFitRatio = VideoWidth/FirstAssetTrack.naturalSize.width;

        if(isFirstAssetPortrait_)
        {
            FirstAssetScaleToFitRatio = VideoWidth/FirstAssetTrack.naturalSize.height;
            CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
            [FirstlayerInstruction setTransform:CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor) atTime:kCMTimeZero];
        }
        else
        {
            CGAffineTransform FirstAssetScaleFactor = CGAffineTransformMakeScale(FirstAssetScaleToFitRatio,FirstAssetScaleToFitRatio);
            [FirstlayerInstruction setTransform:CGAffineTransformConcat(CGAffineTransformConcat(FirstAssetTrack.preferredTransform, FirstAssetScaleFactor),CGAffineTransformMakeTranslation(0, 160)) atTime:kCMTimeZero];
        }
        [FirstlayerInstruction setOpacity:0.0 atTime:firstAsset.duration];

        MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction,nil];

        AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
        MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
        MainCompositionInst.frameDuration = CMTimeMake(1, 30);
        //        MainCompositionInst.renderSize = CGSizeMake(VideoWidth, 900);
        MainCompositionInst.renderSize = CGSizeMake(VideoWidth, [UIScreen mainScreen].bounds.size.height);

        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent:@"CompressedVideo.mov"];

        NSLog(@"myPath Docs : %@",myPathDocs);

        NSURL *url = [NSURL fileURLWithPath:myPathDocs];

        if ([[NSFileManager defaultManager] fileExistsAtPath:myPathDocs])
        {
            NSError *error;
            [[NSFileManager defaultManager] removeItemAtPath:myPathDocs error:&error];
        }

        //Movie Quality
        //==================================================
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc]       initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
        //==================================================

        exporter.outputURL=url;

        //Movie Type
        //==================================================
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        //==================================================
        exporter.videoComposition = MainCompositionInst;
        exporter.shouldOptimizeForNetworkUse = YES;
        [exporter exportAsynchronouslyWithCompletionHandler:^
         {
             dispatch_async(dispatch_get_main_queue(), ^
                            {
                                videoUrToUload = url;
                                [self exportDidFinish:exporter];
                            });
         }];
    }
}

- (void)exportDidFinish:(AVAssetExportSession *)session
{
    if(session.status == AVAssetExportSessionStatusCompleted)
    {
         //Store the URL somewhere using session.outputURL
    }
}
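Note that the export presets above select broad quality levels; they do not expose an explicit bit rate. If you need a specific average bit rate, as the question asks, the `AVAssetWriter` route with `AVVideoCompressionPropertiesKey` (as attempted in the question's code) is the mechanism that accepts one. A sketch of the relevant settings, where the 1,250,000 bps target is just an example value and `videoSize` would come from the source track's `naturalSize`:

```objc
// Sketch: H.264 output settings requesting a specific average bit rate.
NSDictionary *compressionProps = @{ AVVideoAverageBitRateKey : @(1250000) };

NSDictionary *outputSettings = @{
    AVVideoCodecKey                 : AVVideoCodecH264,
    AVVideoCompressionPropertiesKey : compressionProps,
    AVVideoWidthKey                 : @(videoSize.width),
    AVVideoHeightKey                : @(videoSize.height)
};

AVAssetWriterInput *videoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];
```

The average bit rate is a request, not a guarantee; the encoder treats it as a target over the whole stream.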
