iOS AVFoundation Export Session is missing audio
Question
I'm using the iOS AVFoundation framework, and I am able to successfully merge video tracks with image overlays and text overlays. However, my output file doesn't keep the audio intact from my original source video.
How can I make sure that the audio source from one of my videos stays with the new video I create?
*Use this code as a good example of how to accomplish this, creating a video (with its original audio). It was not obvious to me that I needed to include the audio track separately when processing a video with AVFoundation. Hope this helps somebody else.
// Note: in this snippet `url` is the AVAsset loaded from the source URL,
// and `videoComposition` is the AVMutableComposition being built.
AVAssetTrack *videoTrack = nil;
AVAssetTrack *audioTrack = nil;
CMTime insertionPoint = kCMTimeZero;
NSError *error = nil;

if ([[url tracksWithMediaType:AVMediaTypeVideo] count] != 0) {
    videoTrack = [url tracksWithMediaType:AVMediaTypeVideo][0];
}
if ([[url tracksWithMediaType:AVMediaTypeAudio] count] != 0) {
    audioTrack = [url tracksWithMediaType:AVMediaTypeAudio][0];
}

// Insert the video and audio tracks from the AVAsset
if (videoTrack != nil) {
    AVMutableCompositionTrack *compositionVideoTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:videoTrack atTime:insertionPoint error:&error];
}
if (audioTrack != nil) {
    AVMutableCompositionTrack *compositionAudioTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [url duration]) ofTrack:audioTrack atTime:insertionPoint error:&error];
}
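As a side note, a simpler sketch is possible: AVMutableComposition also provides insertTimeRange:ofAsset:atTime:error:, which copies every compatible track (video and audio) from the asset in a single call, so the audio track cannot be forgotten. A minimal sketch, assuming `sourceAsset` is an AVAsset already loaded from your file URL:

// Sketch: copy all tracks (video + audio) from an asset in one call.
// `sourceAsset` is assumed to be an AVAsset loaded elsewhere.
AVMutableComposition *composition = [AVMutableComposition composition];
NSError *error = nil;
BOOL ok = [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, sourceAsset.duration)
                               ofAsset:sourceAsset
                                atTime:kCMTimeZero
                                 error:&error];
if (!ok) {
    NSLog(@"Failed to insert asset: %@", error);
}

The per-track approach above is still useful when you need different layouts for video and audio, but the one-call form is less error-prone for simple concatenation.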
Answer
Here is the complete code which solved this; it combines two videos together with their audio tracks:
AVURLAsset *video1 = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path1] options:nil];
AVURLAsset *video2 = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path2] options:nil];
if (video1 != nil && video2 != nil) {
    // 1 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    // 2 - Video and audio tracks
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *firstTrackAudio = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                             preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration)
                        ofTrack:[[video1 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video2.duration)
                        ofTrack:[[video2 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:video1.duration error:nil];

    // Video 1 has an audio track
    if ([[video1 tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
        AVAssetTrack *clipAudioTrack = [[video1 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [firstTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration) ofTrack:clipAudioTrack atTime:kCMTimeZero error:nil];
    }

    // Video 2 has an audio track
    if ([[video2 tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
        AVAssetTrack *clipAudioTrack = [[video2 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [firstTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, video2.duration) ofTrack:clipAudioTrack atTime:video1.duration error:nil];
    }

    // Export session
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];

    // Create the path to export to - saving to the temporary directory
    NSString *filename = [NSString stringWithFormat:@"Video_%d.mov", arc4random() % 1000];
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];

    // Check if there is already a file at the output URL
    if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
        NSLog(@"Removing item at path: %@", path);
        [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
    }
    exporter.outputURL = [NSURL fileURLWithPath:path];

    // Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    path3 = path;
    [arr_StoredDocumentoryUrls addObject:path3];

    // Export!
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch (exporter.status) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export Complete");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export Error: %@", [exporter.error description]);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                break;
            default:
                break;
        }
    }];
}
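One caveat worth adding: exportAsynchronouslyWithCompletionHandler: may invoke its block on a background queue, so any UI updates or shared state changes made from the handler should hop back to the main queue. A minimal sketch, assuming `exporter` is the AVAssetExportSession configured above:

// Sketch: dispatch back to the main queue before touching UI or shared state.
[exporter exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusCompleted) {
            // Safe to update UI or app state here.
            NSLog(@"Export finished at %@", exporter.outputURL);
        } else {
            NSLog(@"Export failed: %@", exporter.error);
        }
    });
}];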