iOS AVFoundation Export Session is missing audio


Problem description


I am using the iOS AVFoundation framework and I am able to successfully merge video tracks, with image overlays and text overlays. However, my output file doesn't keep the audio from my original source video intact.

How can I make sure that the audio source from one of my videos stays with the new video I create?

EDIT

Use this code as a good example of how to accomplish this, creating a video with the original audio. It was not obvious to me that I needed to include the audio track separately when processing a video with AVFoundation. Hope this helps somebody else.

    // NOTE: despite its name, `url` here is an AVAsset (e.g. an AVURLAsset),
    // and `videoComposition` is an AVMutableComposition created elsewhere.
    AVAssetTrack *videoTrack = nil;
    AVAssetTrack *audioTrack = nil;
    CMTime insertionPoint = kCMTimeZero;
    NSError *error = nil;

    if ([[url tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
        videoTrack = [url tracksWithMediaType:AVMediaTypeVideo][0];
    }

    if ([[url tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
        audioTrack = [url tracksWithMediaType:AVMediaTypeAudio][0];
    }

    // Insert the video and audio tracks from the AVAsset
    if (videoTrack != nil) {
        AVMutableCompositionTrack *compositionVideoTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:videoTrack atTime:insertionPoint error:&error];
    }
    if (audioTrack != nil) {
        AVMutableCompositionTrack *compositionAudioTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:audioTrack atTime:insertionPoint error:&error];
    }
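
A side note (an addition, not part of the original question): reading `tracks` or `duration` on an `AVURLAsset` synchronously, as the snippet above does, can block while the media file is parsed. A minimal sketch of loading those keys up front with `loadValuesAsynchronouslyForKeys:`, assuming the same `url` asset as above:

    [url loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
        NSError *loadError = nil;
        if ([url statusOfValueForKey:@"tracks" error:&loadError] == AVKeyValueStatusLoaded) {
            // Safe to read tracks/duration and build the composition here
            // without blocking the calling thread.
        } else {
            NSLog(@"Failed to load tracks: %@", loadError);
        }
    }];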

Solution

Here is the complete code which solved this; it combines two videos along with their audio tracks:

AVURLAsset* video1 = [[AVURLAsset alloc]initWithURL:[NSURL fileURLWithPath:path1] options:nil];

AVURLAsset* video2 = [[AVURLAsset alloc]initWithURL:[NSURL fileURLWithPath:path2] options:nil];

if (video1 !=nil && video2!=nil) {

    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // 2 - Video track

    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *firstTrackAudio = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];

    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration)
                        ofTrack:[[video1 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video2.duration)
                        ofTrack:[[video2 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:video1.duration error:nil];

// if video1 has an audio track

    if ([[video1 tracksWithMediaType:AVMediaTypeAudio] count] > 0)
    {
        AVAssetTrack *clipAudioTrack = [[video1 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [firstTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
    }

// if video2 has an audio track

    if ([[video2 tracksWithMediaType:AVMediaTypeAudio] count] > 0)
    {
        AVAssetTrack *clipAudioTrack = [[video2 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [firstTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, video2.duration) ofTrack:clipAudioTrack atTime:video1.duration error:nil];
    }

// export session

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];

    //Creates the path to export to  - Saving to temporary directory
    NSString* filename = [NSString stringWithFormat:@"Video_%d.mov",arc4random() % 1000];
    NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];

    //Checks if there is already a file at the output URL.  
    if ([[NSFileManager defaultManager] fileExistsAtPath:path])
    {
        NSLog(@"Removing item at path: %@", path);
        [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    }

    exporter.outputURL = [NSURL fileURLWithPath:path];
    //Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;


    // App-specific bookkeeping from the original answer: remember the exported path
    path3 = path;
    [arr_StoredDocumentoryUrls addObject:path3];

    //Exports!
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch (exporter.status) {
            case AVAssetExportSessionStatusCompleted:{
                NSLog(@"Export Complete");

                break;
            }
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export Error: %@", [exporter.error description]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                break;
            default:
                break;
        }
    }];

}
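
One detail worth noting (an addition, not part of the original answer): `exportAsynchronouslyWithCompletionHandler:` does not invoke its block on the main queue, so any UI updates in the completion handler should be dispatched back to it. A minimal sketch of the pattern:

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            if (exporter.status == AVAssetExportSessionStatusCompleted) {
                // Update UI / app state here, on the main thread.
            } else if (exporter.status == AVAssetExportSessionStatusFailed) {
                NSLog(@"Export Error: %@", exporter.error);
            }
        });
    }];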
