Make movie file with picture Array and song file, using AVAsset


Problem Description

I'm trying to make a movie file from a picture array and an audio file. To build the movie from the picture array I used the big post by zoul here. That part works perfectly: I get my movie with my pictures. However, when I try to add an audio track I run into a lot of problems. To explain, here is my code:

When I call this method, the picture array and the song file are ready:

-(void) writeImagesToMovieAtPath:(NSString *) path withSize:(CGSize) size
{
    NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSArray *dirContents = [[NSFileManager defaultManager] directoryContentsAtPath:documentsDirectoryPath];
    for (NSString *tString in dirContents) {
        if ([tString isEqualToString:@"essai.mp4"]) 
        {
            [[NSFileManager defaultManager]removeItemAtPath:[NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,tString] error:nil];

        }
    }

    NSLog(@"Write Started");

    NSError *error = nil;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                                              error:&error];    
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];

    AudioChannelLayout channelLayout;
    memset(&channelLayout, 0, sizeof(AudioChannelLayout));
    channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;

    NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                                   [NSNumber numberWithFloat:44100.0] ,AVSampleRateKey, 
                                   [NSNumber numberWithInt: 1] ,AVNumberOfChannelsKey,
                                   [NSNumber numberWithInt:192000],AVEncoderBitRateKey,
                                   [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)],AVChannelLayoutKey,
                                   nil];

    AVAssetWriterInput* videoWriterInput = [[AVAssetWriterInput
                                             assetWriterInputWithMediaType:AVMediaTypeVideo
                                             outputSettings:videoSettings] retain];

    AVAssetWriterInput* audioWriterInput = [[AVAssetWriterInput
                                             assetWriterInputWithMediaType:AVMediaTypeAudio
                                             outputSettings:audioSettings] retain];

    NSURL* fileURL = [[NSBundle mainBundle] URLForResource:@"Big_Voice_1" withExtension:@"caf"];

    NSLog(@"%@",fileURL);
    AVAsset *asset = [[AVURLAsset URLAssetWithURL:fileURL 
                                            options:nil] retain];


    AVAssetReader *audioReader = [[AVAssetReader assetReaderWithAsset:asset error:&error] retain];


    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];


    AVAssetTrack* audioTrack = [asset.tracks objectAtIndex:0]; 

    AVAssetReaderOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];

    [audioReader addOutput:readerOutput];                                             


    NSParameterAssert(videoWriterInput);
    NSParameterAssert(audioWriterInput);
    NSParameterAssert([videoWriter canAddInput:audioWriterInput]);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    audioWriterInput.expectsMediaDataInRealTime = NO;
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:audioWriterInput];
    [videoWriter addInput:videoWriterInput];
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];


    //Video encoding

    CVPixelBufferRef buffer = NULL;

    //convert uiimage to CGImage.

    int frameCount = 0;

    for(int i = 0; i<20; i++)
    {
        buffer = [self pixelBufferFromCGImage:[[m_PictArray objectAtIndex:i] CGImage] andSize:size];


        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30) 
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData) 
            {
                printf("appending %d attemp %d
", frameCount, j);

                CMTime frameTime = CMTimeMake(frameCount,(int32_t) 10);

                //CVPixelBufferPoolCreatePixelBuffer (kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
                NSParameterAssert(bufferPool != NULL);

                [NSThread sleepForTimeInterval:0.05];
            } 
            else 
            {
                printf("adaptor not ready %d, %d
", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d
", frameCount, j);
        }
        frameCount++;
    }

    //Finish the session:
    [videoWriterInput markAsFinished];





        //Start a session:
        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        CVPixelBufferRef buffer = NULL;

        //Write all picture array in movie file.

        int frameCount = 0;

        for(int i = 0; i<[m_PictArray count]; i++)
        {
            buffer = [self pixelBufferFromCGImage:[[m_PictArray objectAtIndex:i] CGImage] andSize:size];


            BOOL append_ok = NO;
            int j = 0;
            while (!append_ok && j < 30) 
            {
                if (adaptor.assetWriterInput.readyForMoreMediaData) 
                {
                    printf("appending %d attemp %d
", frameCount, j);

                    CMTime frameTime = CMTimeMake(frameCount,(int32_t) 10);


                    append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                    CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
                    NSParameterAssert(bufferPool != NULL);

                    [NSThread sleepForTimeInterval:0.05];
                } 
                else 
                {
                    printf("adaptor not ready %d, %d
", frameCount, j);
                    [NSThread sleepForTimeInterval:0.1];
                }
                j++;
            }
            if (!append_ok) {
                printf("error appending image %d times %d
", frameCount, j);
            }
            frameCount++;
        }

        //Finish writing picture:
        [videoWriterInput markAsFinished];

Once I have finished writing the pictures into the movie file, I want to copy the audio into the file, so I do this:

[audioReader startReading];

    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
    [audioWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^
     {
         NSLog(@"Request");
         NSLog(@"Asset Writer ready :%d",audioWriterInput.readyForMoreMediaData);
         while (audioWriterInput.readyForMoreMediaData) {
             NSLog(@"Ready");
             CMSampleBufferRef nextBuffer = [readerOutput copyNextSampleBuffer];
             if (nextBuffer) {
                 NSLog(@"NextBuffer");
                 [audioWriterInput appendSampleBuffer:nextBuffer];
             }
         }
     }
     ];

    [audioWriterInput markAsFinished];
    [videoWriter finishWriting];

However, the readyForMoreMediaData status of the audio file's AVAssetWriterInput is always NO.
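
For reference, the usual pull-style pattern for requestMediaDataWhenReadyOnQueue:usingBlock: only marks the audio input as finished from inside the callback, once the reader has run dry. A minimal sketch, not my exact code, reusing the readerOutput, audioWriterInput and videoWriter variables from above and assuming the video input has already been marked as finished:

    [audioReader startReading];
    dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);
    [audioWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^
     {
         while (audioWriterInput.readyForMoreMediaData)
         {
             CMSampleBufferRef nextBuffer = [readerOutput copyNextSampleBuffer];
             if (nextBuffer)
             {
                 [audioWriterInput appendSampleBuffer:nextBuffer];
                 CFRelease(nextBuffer); // copyNextSampleBuffer returns a retained buffer
             }
             else
             {
                 // The reader is drained: finish the audio input here, inside the
                 // asynchronous block, not right after requestMediaDataWhenReadyOnQueue: returns.
                 [audioWriterInput markAsFinished];
                 [videoWriter finishWriting];
                 break;
             }
         }
     }];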

My question: how do I add audio to a video file using AVFoundation?

So please, can someone help me by telling me whether I forgot something or whether something is wrong?

Thank you very much.

Recommended Answer

I finally found out how to make a movie with a picture array and an audio file. So if you want to do the same thing, here is my code (be careful with memory):

  • First, make the movie file from your picture array, using zoul's post here:

-(void) writeImagesToMovieAtPath:(NSString *) path withSize:(CGSize) size
{
  NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
  NSArray *dirContents = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:documentsDirectoryPath error:nil];
  for (NSString *tString in dirContents) 
  {
    if ([tString isEqualToString:@"essai.mp4"]) 
    {
        [[NSFileManager defaultManager]removeItemAtPath:[NSString stringWithFormat:@"%@/%@",documentsDirectoryPath,tString] error:nil];

    }
  }

  NSLog(@"Write Started");

  NSError *error = nil;

  AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                              [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                                          error:&error];    
  NSParameterAssert(videoWriter);

  NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                               [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                               nil];


  AVAssetWriterInput* videoWriterInput = [[AVAssetWriterInput
                                         assetWriterInputWithMediaType:AVMediaTypeVideo
                                         outputSettings:videoSettings] retain];




  AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                 sourcePixelBufferAttributes:nil];

  NSParameterAssert(videoWriterInput);

  NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
  videoWriterInput.expectsMediaDataInRealTime = YES;
  [videoWriter addInput:videoWriterInput];
  //Start a session:
  [videoWriter startWriting];
  [videoWriter startSessionAtSourceTime:kCMTimeZero];


  //Video encoding

  CVPixelBufferRef buffer = NULL;

  //convert uiimage to CGImage.

  int frameCount = 0;

  for(int i = 0; i<[m_PictArray count]; i++)
  {
    buffer = [self pixelBufferFromCGImage:[[m_PictArray objectAtIndex:i] CGImage] andSize:size];


    BOOL append_ok = NO;
    int j = 0;
    while (!append_ok && j < 30) 
    {
        if (adaptor.assetWriterInput.readyForMoreMediaData) 
        {
            printf("appending %d attemp %d
", frameCount, j);

            CMTime frameTime = CMTimeMake(frameCount,(int32_t) 10);

            append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
            NSParameterAssert(bufferPool != NULL);

            [NSThread sleepForTimeInterval:0.05];
        } 
        else 
        {
            printf("adaptor not ready %d, %d
", frameCount, j);
            [NSThread sleepForTimeInterval:0.1];
        }
        j++;
    }
    if (!append_ok) 
    {
        printf("error appending image %d times %d
", frameCount, j);
    }
    frameCount++;
    CVBufferRelease(buffer);
  }

  [videoWriterInput markAsFinished];
  [videoWriter finishWriting];

  [videoWriterInput release];
  [videoWriter release];

  [m_PictArray removeAllObjects];

  NSLog(@"Write Ended"); 
}
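
The pixelBufferFromCGImage:andSize: helper used in the loop above comes from zoul's post and is not reproduced here; a minimal sketch of a typical implementation (the pixel format and the options dictionary are my assumptions, not necessarily the exact code):

    - (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image andSize:(CGSize)size
    {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                              kCVPixelFormatType_32ARGB,
                                              (CFDictionaryRef)options, &pxbuffer);
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

        // Draw the CGImage into the pixel buffer's backing memory.
        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                     CVPixelBufferGetBytesPerRow(pxbuffer),
                                                     rgbColorSpace, kCGImageAlphaNoneSkipFirst);
        CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);

        CGContextRelease(context);
        CGColorSpaceRelease(rgbColorSpace);
        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

        return pxbuffer; // the caller releases it (CVBufferRelease in the loop above)
    }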

  • After that, you must put the movie file and the audio file together. To do this, follow my code:

    -(void)CompileFilesToMakeMovie
    {
      AVMutableComposition* mixComposition = [AVMutableComposition composition];
    
      NSString* audio_inputFileName = @"deformed.caf";
      NSString* audio_inputFilePath = [Utilities documentsPath:audio_inputFileName];
      NSURL*    audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath];
    
      NSString* video_inputFileName = @"essai.mp4";
      NSString* video_inputFilePath = [Utilities documentsPath:video_inputFileName];
      NSURL*    video_inputFileUrl = [NSURL fileURLWithPath:video_inputFilePath];
    
      NSString* outputFileName = @"outputFile.mov";
      NSString* outputFilePath = [Utilities documentsPath:outputFileName];
      NSURL*    outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    
      if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath]) 
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    
    
    
      CMTime nextClipStartTime = kCMTimeZero;
    
      AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];
      CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);
      AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
      [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];
    
      //nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);
    
      AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];
      CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
      AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
      [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];
    
    
    
      AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];   
      _assetExport.outputFileType = @"com.apple.quicktime-movie";
      _assetExport.outputURL = outputFileUrl;
    
      [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         [self saveVideoToAlbum:outputFilePath]; 
     }       
     ];  
    }
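
The [Utilities documentsPath:] and saveVideoToAlbum: helpers referenced above are not shown; a minimal sketch of what they could look like (the implementations are assumptions, only the names are taken from the code):

    // Assumed helper on the Utilities class: full path of a file in the Documents directory.
    + (NSString *)documentsPath:(NSString *)fileName
    {
        NSString *documentsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
        return [documentsDirectory stringByAppendingPathComponent:fileName];
    }

    // Assumed helper: copies the exported movie into the saved photos album (UIKit).
    - (void)saveVideoToAlbum:(NSString *)videoPath
    {
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(videoPath))
        {
            UISaveVideoAtPathToSavedPhotosAlbum(videoPath, nil, NULL, NULL);
        }
    }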
    

  • Sorry if there are some leaks; I'm still working on the memory optimization.
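
Putting the two steps together, the call sequence could look roughly like this (the frame size and the use of [Utilities documentsPath:] for the essai.mp4 path are assumptions based on the code above):

    // Hypothetical call sequence: write the image track first, then mux in the audio.
    NSString *moviePath = [Utilities documentsPath:@"essai.mp4"];            // the file CompileFilesToMakeMovie expects
    [self writeImagesToMovieAtPath:moviePath withSize:CGSizeMake(320, 480)]; // frame size is an assumption
    [self CompileFilesToMakeMovie];  // muxes essai.mp4 with deformed.caf into outputFile.mov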
