AVFoundation - Retiming CMSampleBufferRef Video Output

Question

First time asking a question here. I'm hoping the post is clear and sample code is formatted correctly.

I'm experimenting with AVFoundation and time-lapse photography.

My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a time-lapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter, and AVAssetWriterInput.

The problem is, if I use the CMSampleBufferRef passed to captureOutput:didOutputSampleBuffer:fromConnection:, the playback of each frame is the length of time between the original input frames. A frame rate of, say, 1 fps. I'm looking to get 30 fps.

I've tried using CMSampleBufferCreateCopyWithNewTiming(), but after 13 frames are written to the file, captureOutput:didOutputSampleBuffer:fromConnection: stops being called. The interface is still active, and I can tap a button to stop the capture and save it to the photo library for playback. It appears to play back as I want, at 30 fps, but it only has those 13 frames.

How can I accomplish my goal of 30 fps playback? How can I tell where the app is getting lost, and why?

I've placed a flag called useNativeTime so I can test both cases. When it's set to YES, I get all the frames I'm interested in, since the callback doesn't 'get lost'. When I set the flag to NO, only 13 frames ever get processed and the method is never called again. As mentioned above, in both cases I can play back the video.

Thanks for any help.

Here is where I'm trying to do the retiming.

    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;

    //NSLog(@"in captureOutpput sample buffer method");
    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }

    if (! [inputWriterBuffer isReadyForMoreMediaData])
    {
        NSLog(@"Not ready for data.");
    }
    else {
        // Write the first of every n frames (camera delivers 30 fps natively).
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }

        // Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake( 0 * 20 ,600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime: imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }

        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            // CMTime myTiming = CMTimeMake(writtenFrames * 20,600);
            // CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); // Tried, but it has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20,600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake( (writtenFrames + 0) * 20,600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;

            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer) );
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, // maybe a little confused on this param.
                                                             &newSampleBuffer);
            // These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i",myStatus);
            if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");

            // No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer);  // How is this different; CMSampleBufferSetDataReady ?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i",myStatus);

            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); // Docs don't describe action. WTF does it do? Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
            //CFRelease(sampleBuffer); // - Not surprisingly - "EXC_BAD_ACCESS"
        }

        if (!appendSuccessFlag)
        {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }

    //[self displayOuptutWritterStatus];    // Expect and see AVAssetWriterStatusWriting.
}
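
A note on the timing values used above: CMTimeMake(20, 600) is 20/600 of a second, i.e. 1/30 s (600 is a common video timescale because it divides evenly by 24, 25, 30, and 60). Stamping the Nth written frame at N * 20/600 s is what should yield 30 fps playback. A quick sketch of that arithmetic:

    CMTime frameDuration = CMTimeMake(20, 600);       // 20/600 s = 1/30 s
    CMTime tenthFramePTS = CMTimeMake(10 * 20, 600);  // 200/600 s ~= 0.333 s
    CMTimeShow(frameDuration);   // expect output along the lines of {20/600 = 0.033}
    CMTimeShow(tenthFramePTS);   // expect output along the lines of {200/600 = 0.333}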

My setup routine.

    - (IBAction) recordingStartStop: (id) sender
{
    NSError * error;

    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle: @"Record" forState: UIControlStateNormal];

        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; // Blocks until file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);

        // Really, I should loop through the outputs and close all of them or target specific ones.
        // Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]];

        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;
        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum ([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError: contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder: @"TestProject"];
        intervalFrames = 30;

        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary * cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
        NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
        [cameraVideoSettings setValue: value forKey: key];
        [videoOutput setVideoSettings: cameraVideoSettings];
        [videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; // 30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames: YES];

        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate: self queue: queue];
        dispatch_release(queue);

        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey];
        [outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; // currently assuming
        [outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey];

        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey];
        //[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey];
        [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey];

        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;

        outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error];
        [outputWriter retain];

        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:");
        if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer];
        else NSLog(@"can not add input");

        if (![outputWriter canApplyOutputSettings: outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"ouptutSettings are NOT supported");

        if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");

        //[self.captureSession startRunning];
        self.isRecording = YES;
        [recordingStarStop setTitle: @"Stop" forState: UIControlStateNormal];

        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime: imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog (@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");

    [self displayOuptutWritterStatus];  
}

Solution

OK, I found the bug in my first post.

When using

myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                 sampleBuffer,
                                                 1,
                                                 &sampleTimingInfo, 
                                                 &newSampleBuffer);

you need to balance that with a CFRelease(newSampleBuffer);
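
For clarity, here is a minimal sketch of the retiming branch once balanced, using the variable names from my code above (everything else in the delegate method stays the same):

    CMSampleBufferRef newSampleBuffer = NULL;
    CMSampleTimingInfo sampleTimingInfo;
    sampleTimingInfo.duration = CMTimeMake(20, 600);  // 1/30 s per written frame
    sampleTimingInfo.presentationTimeStamp = CMTimeMake(writtenFrames * 20, 600);
    sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;

    OSStatus myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                              sampleBuffer,
                                                              1,
                                                              &sampleTimingInfo,
                                                              &newSampleBuffer);
    if (myStatus == noErr && newSampleBuffer != NULL) {
        appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
        CFRelease(newSampleBuffer); // Create rule: we own this copy, so we must release it.
    }

This also plausibly explains the 13-frame symptom: the capture pipeline recycles a small, fixed pool of buffers, each leaked copy keeps its underlying frame data alive, and once the pool is exhausted the delegate simply stops receiving samples.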

The same idea holds true when using a CVPixelBufferRef from the pixelBufferPool of an AVAssetWriterInputPixelBufferAdaptor instance. You would use CVPixelBufferRelease(yourCVPixelBufferRef); after calling the appendPixelBuffer:withPresentationTime: method.
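
A comparable sketch for the pixel-buffer path (the adaptor setup and the step that fills the buffer are assumed to exist elsewhere; the names here are illustrative):

    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn poolStatus = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                             adaptor.pixelBufferPool,
                                                             &pixelBuffer);
    if (poolStatus == kCVReturnSuccess && pixelBuffer != NULL) {
        // ... draw or copy the frame's image data into pixelBuffer ...
        CMTime presentationTime = CMTimeMake(writtenFrames * 20, 600); // 1/30 s per frame
        if ([adaptor appendPixelBuffer:pixelBuffer withPresentationTime:presentationTime]) {
            writtenFrames++;
        }
        CVPixelBufferRelease(pixelBuffer); // balances the pool allocation above
    }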

Hope this is helpful to someone else.
