How to write a movie with video AND audio using AVAssetWriter?
Question
I want to export a movie with AVAssetWriter and can't figure out how to keep the video and audio tracks in sync. Exporting only video works fine, but when I add audio the resulting movie looks like this:

First I see the video (without audio), then the video freezes (showing the last image frame until the end), and after some seconds I hear the audio.
I tried some things with CMSampleBufferSetOutputPresentationTimeStamp (subtracting the first CMSampleBufferGetPresentationTimeStamp from the current one) for the audio, but none of it worked, and I don't think it is the right direction, since video & audio in the source movie should be in sync anyway...

My setup in short: I create an AVAssetReader and 2 AVAssetReaderTrackOutput (one for video, one for audio) and add them to the AVAssetReader, then I create an AVAssetWriter and 2 AVAssetWriterInput (video & audio) and add them to the AVAssetWriter. I start it all up with:

[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
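The setup described above might look roughly like this (a minimal sketch under assumptions: `sourceAsset` and `outputURL` are not in the original post, and nil outputSettings is used for passthrough of the compressed samples):

```objectivec
// Hypothetical setup matching the description: one reader with two track
// outputs, one writer with two inputs. Error handling omitted for brevity.
AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:sourceAsset error:nil];
AVAssetTrack *videoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *audioTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

// nil outputSettings vends the samples in their stored (compressed) format
AVAssetReaderTrackOutput *assetReaderVideoOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
AVAssetReaderTrackOutput *assetReaderAudioOutput =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
[assetReader addOutput:assetReaderVideoOutput];
[assetReader addOutput:assetReaderAudioOutput];

AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:nil];
// nil outputSettings again means passthrough: samples are written as appended
AVAssetWriterInput *assetWriterVideoInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:nil];
AVAssetWriterInput *assetWriterAudioInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil];
[assetWriter addInput:assetWriterVideoInput];
[assetWriter addInput:assetWriterAudioInput];
```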
Then I run 2 queues for doing the sample buffer stuff:
dispatch_queue_t queueVideo = dispatch_queue_create("assetVideoWriterQueue", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queueVideo usingBlock:^
{
    while ([assetWriterVideoInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer = [assetReaderVideoOutput copyNextSampleBuffer];
        if (sampleBuffer)
        {
            [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        }
        else
        {
            [assetWriterVideoInput markAsFinished];
            dispatch_release(queueVideo);
            videoFinished = YES;
            break;
        }
    }
}];

dispatch_queue_t queueAudio = dispatch_queue_create("assetAudioWriterQueue", NULL);
[assetWriterAudioInput requestMediaDataWhenReadyOnQueue:queueAudio usingBlock:^
{
    while ([assetWriterAudioInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef sampleBuffer = [assetReaderAudioOutput copyNextSampleBuffer];
        if (sampleBuffer)
        {
            [assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        }
        else
        {
            [assetWriterAudioInput markAsFinished];
            dispatch_release(queueAudio);
            audioFinished = YES;
            break;
        }
    }
}];
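One thing worth noting about the loops above: appendSampleBuffer: returns a BOOL, and ignoring a NO result hides writer failures. A minimal sketch of checking it inside the video loop (the error handling itself is an assumption added here, not part of the original code):

```objectivec
// Hypothetical variant of the inner loop body: check the result of
// appendSampleBuffer: and inspect the writer's status/error on failure.
if (sampleBuffer)
{
    if (![assetWriterVideoInput appendSampleBuffer:sampleBuffer])
    {
        // The writer has entered the failed state; its error explains why.
        NSLog(@"video append failed: %@ (status %ld)",
              assetWriter.error, (long)assetWriter.status);
    }
    CFRelease(sampleBuffer);
}
```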
In the main loop I wait for both queues until they finish:
while (!videoFinished && !audioFinished)
{
    sleep(1);
}
[assetWriter finishWriting];
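As a sketch of an alternative to polling the two flags with sleep(), a GCD dispatch group can signal when both tracks are done; the group, its enter/leave placement, and the completion block are assumptions, not part of the original code:

```objectivec
// Hypothetical rework: enter the group once per track, leave it in each
// markAsFinished branch, and finish the writer when both have left.
dispatch_group_t writingGroup = dispatch_group_create();
dispatch_group_enter(writingGroup);  // video
dispatch_group_enter(writingGroup);  // audio

// ... inside each requestMediaDataWhenReadyOnQueue: block, right after
// [input markAsFinished]:
//     dispatch_group_leave(writingGroup);

dispatch_group_notify(writingGroup, dispatch_get_main_queue(), ^{
    [assetWriter finishWriting];
});
```

This avoids blocking the main thread in a sleep loop while waiting for the two writer inputs to drain.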
Furthermore I try to save the resulting file in the library with the following code...
NSURL *url = [[NSURL alloc] initFileURLWithPath:path];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:url])
{
    [library writeVideoAtPathToSavedPhotosAlbum:url completionBlock:^(NSURL *assetURL, NSError *error)
    {
        if (error)
            NSLog(@"error=%@", error.localizedDescription);
        else
            NSLog(@"completed...");
    }];
}
else
    NSLog(@"error, video not saved...");
[library release];
[url release];
...but I get the error:
Video /Users/cb/Library/Application Support/iPhone Simulator/4.2/Applications/E9865BF9-D190-4912-9248-66768B1AB635/Documents/export.mp4 cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12950 "Movie could not be played." UserInfo=0x5e4fb90 {NSLocalizedDescription=Movie could not be played.}
This saving code works without problems in another program, so something must be wrong with the movie itself...?
Solution

- (void)mergeAudioVideo
{
    NSString *videoOutputPath = [_documentsDirectory stringByAppendingPathComponent:@"dummy_video.mp4"];
    NSString *outputFilePath = [_documentsDirectory stringByAppendingPathComponent:@"final_video.mp4"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    NSString *filePath = [_documentsDirectory stringByAppendingPathComponent:@"newFile.m4a"];

    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    NSURL *audio_inputFileUrl = [NSURL fileURLWithPath:filePath];
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:videoOutputPath];
    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange
                                     ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                      atTime:nextClipStartTime
                                       error:nil];

    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange
                                     ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                      atTime:nextClipStartTime
                                       error:nil];

    AVAssetExportSession *_assetExport =
        [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetMediumQuality];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void)
    {
        if (_assetExport.status == AVAssetExportSessionStatusCompleted)
        {
            // Write code here to continue on success
        }
        else
        {
            // Write failure-handling code here
        }
    }];
}
You can use this code to merge audio and video.