AVFoundation - Reverse an AVAsset and output video file


Question

I've seen this question asked a few times, but none of them seem to have any working answers.

The requirement is to reverse and output a video file (not just play it in reverse), keeping the same compression, format, and frame rate as the source video.

Ideally, the solution would be able to do this all in memory or in a buffer and avoid generating the frames into image files (for example, using AVAssetImageGenerator) and then recompiling them (resource intensive, unreliable timing results, changes in frame/image quality from the original, etc.).

-

My contribution: This is still not working, but the best I've tried so far:


  • Read the sample frames into an array of CMSampleBufferRef[] using AVAssetReader.

  • Write them back in reverse order using AVAssetWriter.

  • Problem: The timing for each frame is saved in the CMSampleBufferRef, so even appending them backwards will not work.

  • Next, I tried swapping the timing information of each frame with that of its reverse/mirror frame.

  • Problem: This causes an unknown error with AVAssetWriter.

  • Next step: I'm going to look into AVAssetWriterInputPixelBufferAdaptor.

- (AVAsset *)assetByReversingAsset:(AVAsset *)asset {
    NSURL *tmpFileURL = [NSURL fileURLWithPath:@"/tmp/test.mp4"];
    NSError *error;

    // initialize the AVAssetReader that will read the input asset track
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];

    AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];
    [reader addOutput:readerOutput];
    [reader startReading];

    // Read in the samples into an array
    NSMutableArray *samples = [[NSMutableArray alloc] init];

    while(1) {
        CMSampleBufferRef sample = [readerOutput copyNextSampleBuffer];

        if (sample == NULL) {
            break;
        }

        [samples addObject:(__bridge id)sample];
        CFRelease(sample);
    }

    // initialize the writer that will save to our temporary file.
    CMFormatDescriptionRef formatDescription = CFBridgingRetain([videoTrack.formatDescriptions lastObject]);
    AVAssetWriterInput *writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:nil sourceFormatHint:formatDescription];
    CFRelease(formatDescription);

    AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:tmpFileURL
                                                      fileType:AVFileTypeMPEG4
                                                         error:&error];
    [writerInput setExpectsMediaDataInRealTime:NO];
    [writer addInput:writerInput];
    [writer startWriting];
    [writer startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)samples[0])];


    // Traverse the sample frames in reverse order
    for(NSInteger i = samples.count-1; i >= 0; i--) {
        CMSampleBufferRef sample = (__bridge CMSampleBufferRef)samples[i];

        // Since the timing information is built into the CMSampleBufferRef 
        // We will need to make a copy of it with new timing info. Will copy
        // the timing data from the mirror frame at samples[samples.count - i -1]

        CMItemCount numSampleTimingEntries;
        CMSampleBufferRef mirrorSample = (__bridge CMSampleBufferRef)samples[samples.count - i - 1];
        CMSampleBufferGetSampleTimingInfoArray(mirrorSample, 0, NULL, &numSampleTimingEntries);
        CMSampleTimingInfo *timingInfo = malloc(sizeof(CMSampleTimingInfo) * numSampleTimingEntries);
        CMSampleBufferGetSampleTimingInfoArray(mirrorSample, numSampleTimingEntries, timingInfo, &numSampleTimingEntries);

        CMSampleBufferRef sampleWithCorrectTiming;
        CMSampleBufferCreateCopyWithNewTiming(
                                              kCFAllocatorDefault,
                                              sample,
                                              numSampleTimingEntries,
                                              timingInfo,
                                              &sampleWithCorrectTiming);

        if (writerInput.readyForMoreMediaData)  {
            [writerInput appendSampleBuffer:sampleWithCorrectTiming];
        }

        CFRelease(sampleWithCorrectTiming);
        free(timingInfo);
    }

    [writerInput markAsFinished];
    [writer finishWriting];

    return [AVAsset assetWithURL:tmpFileURL];
}


Recommended Answer

Worked on this over the last few days and was able to get it working.

Source code here: http://www.andyhin.com/post/5/reverse-video-avfoundation

Uses AVAssetReader to read out the samples/frames, extracts the image/pixel buffer, and then appends it with the presentation time of the mirror frame.
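A minimal sketch of that approach, reconstructed from the description above (the reader pixel format, writer codec settings, and buffer-everything-in-memory strategy are assumptions for illustration, not taken from the linked post). The key difference from the question's attempt is that the reader decodes to pixel buffers, and each frame's pixel buffer is appended through an AVAssetWriterInputPixelBufferAdaptor at the presentation time of its mirror frame, so the output timeline stays monotonically increasing:

```objc
#import <AVFoundation/AVFoundation.h>

- (void)reverseAsset:(AVAsset *)asset toURL:(NSURL *)outputURL {
    NSError *error = nil;
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] lastObject];

    // Decode to pixel buffers (unlike outputSettings:nil, which passes
    // compressed samples through and cannot be re-timed this way).
    NSDictionary *readerSettings = @{(id)kCVPixelBufferPixelFormatTypeKey:
        @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)};
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
    AVAssetReaderTrackOutput *output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                   outputSettings:readerSettings];
    [reader addOutput:output];
    [reader startReading];

    // Pass 1: buffer all frames. Note this holds every decoded frame in
    // memory, so it only suits short clips.
    NSMutableArray *samples = [NSMutableArray array];
    CMSampleBufferRef sample;
    while ((sample = [output copyNextSampleBuffer])) {
        [samples addObject:(__bridge id)sample];
        CFRelease(sample);
    }

    // Re-encode on write; assumed H.264 at the track's natural size.
    NSDictionary *writerSettings = @{AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: @(videoTrack.naturalSize.width),
                                     AVVideoHeightKey: @(videoTrack.naturalSize.height)};
    AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                                      fileType:AVFileTypeMPEG4
                                                         error:&error];
    AVAssetWriterInput *writerInput =
        [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
                                       outputSettings:writerSettings];
    writerInput.expectsMediaDataInRealTime = NO;
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor
            assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                       sourcePixelBufferAttributes:nil];
    [writer addInput:writerInput];
    [writer startWriting];
    [writer startSessionAtSourceTime:
        CMSampleBufferGetPresentationTimeStamp((__bridge CMSampleBufferRef)samples[0])];

    // Pass 2: walk the frames backwards, but stamp frame i with the
    // presentation time of its mirror frame samples[count - i - 1].
    for (NSInteger i = samples.count - 1; i >= 0; i--) {
        CMSampleBufferRef frame = (__bridge CMSampleBufferRef)samples[i];
        CMSampleBufferRef mirror =
            (__bridge CMSampleBufferRef)samples[samples.count - i - 1];
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(frame);
        CMTime time = CMSampleBufferGetPresentationTimeStamp(mirror);
        while (!writerInput.readyForMoreMediaData) {
            [NSThread sleepForTimeInterval:0.05];
        }
        [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:time];
    }

    [writerInput markAsFinished];
    [writer finishWritingWithCompletionHandler:^{ /* output file is ready */ }];
}
```

Because the frames are re-encoded rather than passed through, the output compression settings must be chosen to match the source if the "same compression" requirement matters; see the linked post for the full implementation.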
