Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode"


Problem Description



I've run into some strange behaviour while trying to merge videos with AVFoundation. I'm pretty sure I've made a mistake somewhere, but I'm too blind to see it. My goal is simply to merge four videos (later there will be a crossfade transition between them). Every time I try to export the video I get this error:

Error Domain=AVFoundationErrorDomain Code=-11821 "Cannot Decode" UserInfo=0x7fd94073cc30 {NSLocalizedDescription=Cannot Decode, NSLocalizedFailureReason=The media data could not be decoded. It may be damaged.}

The strangest thing is that if I don't provide the AVAssetExportSession with an AVMutableVideoComposition, everything works fine! I can't understand what I'm doing wrong. The source videos were downloaded from YouTube and have the .mp4 extension; I can play them with MPMoviePlayerController. While checking the source code, please look carefully at the AVMutableVideoComposition setup. I tested this code in Xcode 6.0.1 on the iOS Simulator.

#import "VideoStitcher.h"
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

@implementation VideoStitcher
{
    VideoStitcherCompletionBlock _completionBlock;
    AVMutableComposition *_composition;
    AVMutableVideoComposition *_videoComposition;
}

- (instancetype)init
{
    self = [super init];
    if (self)
    {
        _composition = [AVMutableComposition composition];
        _videoComposition = [AVMutableVideoComposition videoComposition];
    }
    return self;
}

- (void)compileVideoWithAssets:(NSArray *)assets completion:(VideoStitcherCompletionBlock)completion
{
    _completionBlock = [completion copy];

    if (assets == nil || assets.count < 2)
    {
        // We need at least two videos to make a stitch, right?
        NSAssert(NO, @"VideoStitcher: assets parameter is nil or has not enough items in it");
    }
    else
    {
        [self composeAssets:assets];
        if (_composition != nil) // if stitching went good and no errors were found
            [self exportComposition];
    }
}

- (void)composeAssets:(NSArray *)assets
{
    AVMutableCompositionTrack *compositionVideoTrack = [_composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *compositionError = nil;
    CMTime currentTime = kCMTimeZero;
    AVAsset *asset = nil;
    for (int i = (int)assets.count - 1; i >= 0; i--) //For some reason videos are compiled in reverse order. Find the bug later. 06.10.14
    {
        asset = assets[i];
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        BOOL success = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetVideoTrack.timeRange.duration)
                                                      ofTrack:assetVideoTrack
                                                       atTime:currentTime
                                                        error:&compositionError];
        if (success)
        {
            CMTimeAdd(currentTime, asset.duration);
        }
        else
        {
            NSLog(@"VideoStitcher: something went wrong during inserting time range in composition");
            if (compositionError != nil)
            {
                NSLog(@"%@", compositionError);
                _completionBlock(nil, compositionError);
                _composition = nil;
                return;
            }
        }
    }

    AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration);
    videoCompositionInstruction.backgroundColor = [[UIColor redColor] CGColor];
    _videoComposition.instructions = @[videoCompositionInstruction];
    _videoComposition.renderSize = [self calculateOptimalRenderSizeFromAssets:assets];
    _videoComposition.frameDuration = CMTimeMake(1, 600);
}

- (void)exportComposition
{
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent:@"testVideo.mov"];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];


    NSString *filePath = [url path];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:filePath]) {
        NSError *error;
        if ([fileManager removeItemAtPath:filePath error:&error] == NO) {
            NSLog(@"removeItemAtPath %@ error:%@", filePath, error);
        }
    }

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:_composition
                                                                      presetName:AVAssetExportPreset1280x720];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = _videoComposition;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        [self exportDidFinish:exporter];
    }];
}

- (void)exportDidFinish:(AVAssetExportSession*)session
{
    NSLog(@"%li", session.status);
    if (session.status == AVAssetExportSessionStatusCompleted)
    {
        NSURL *outputURL = session.outputURL;

        // time to call delegate methods, but for testing purposes we save the video in 'photos' app

        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL])
        {
            [library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error){
                if (error == nil)
                {
                    NSLog(@"successfully saved video");
                }
                else
                {
                    NSLog(@"saving video failed.\n%@", error);
                }
            }];
        }
    }
    else if (session.status == AVAssetExportSessionStatusFailed)
    {
        NSLog(@"VideoStitcher: exporting failed.\n%@", session.error);
    }
}

- (CGSize)calculateOptimalRenderSizeFromAssets:(NSArray *)assets
{
    AVAsset *firstAsset = assets[0];
    AVAssetTrack *firstAssetVideoTrack = [[firstAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CGFloat maxWidth = firstAssetVideoTrack.naturalSize.height;
    CGFloat maxHeight = firstAssetVideoTrack.naturalSize.width;

    for (AVAsset *asset in assets)
    {
        AVAssetTrack *assetVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        if (assetVideoTrack.naturalSize.width > maxWidth)
            maxWidth = assetVideoTrack.naturalSize.width;
        if (assetVideoTrack.naturalSize.height > maxHeight)
            maxHeight = assetVideoTrack.naturalSize.height;
    }

    return CGSizeMake(maxWidth, maxHeight);
}

@end

Thank you for your attention. I'm really tired; I've been trying to find the bug for four hours straight. I'll go to sleep now.

Solution

I've finally found the solution. The description of the error led me in the wrong direction: "Cannot Decode. The media data could not be decoded. It may be damaged." From this description you might think there is something wrong with your video files. I spent five hours experimenting with formats, debugging, and so on.

Well, THE ANSWER IS COMPLETELY DIFFERENT!

My mistake was that I forgot that CMTimeAdd() returns a value. I thought it modified its first argument in place, and in the code you can see this:

CMTime currentTime = kCMTimeZero;
for (int i = (int)assets.count - 1; i >= 0; i--)
{
    CMTimeAdd(currentTime, asset.duration); //HERE!! I don't actually increment the value! currentTime is always kCMTimeZero
}
videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, _composition.duration); // And that's where everything breaks!

The lesson I've learned: when working with AVFoundation, always check your time values! It's very important; otherwise you'll run into a lot of bugs.
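For spot-checking time values while debugging, CoreMedia ships small logging helpers (CMTimeShow, CMTimeRangeShow) and validity macros (CMTIME_IS_VALID). A hedged sketch of how the bug might have been caught earlier, using the variables from composeAssets: above:

```objectivec
// Inside the insertion loop, after (correctly) advancing currentTime:
currentTime = CMTimeAdd(currentTime, asset.duration);
CMTimeShow(currentTime); // logs the value/timescale pair to the console

// After building the instruction, confirm the range really spans the composition
// and that time actually accumulated:
CMTimeRangeShow(videoCompositionInstruction.timeRange);
NSAssert(CMTIME_IS_VALID(currentTime) && CMTimeCompare(currentTime, kCMTimeZero) > 0,
         @"VideoStitcher: accumulated time should be positive by now");
```

Had this assertion been in place, it would have fired on the very first run instead of letting the zeroed time range surface later as a cryptic decode error.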
