Skip over frames while processing video on iOS

Question

I'm trying to process a local video file and simply do some analysis on the pixel data. Nothing is being output. My current code iterates through each frame of the video but I'd actually like to skip ~15 frames at a time to speed things up. Is there a way to skip over frames without decoding them?

In Ffmpeg, I could simply call av_read_frame without calling avcodec_decode_video2.

Thanks in advance! Here's my current code:

- (void) readMovie:(NSURL *)url
{

    [self performSelectorOnMainThread:@selector(updateInfo:) withObject:@"scanning" waitUntilDone:YES];

    startTime = [NSDate date];

    AVURLAsset * asset = [AVURLAsset URLAssetWithURL:url options:nil];

    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:
     ^{
         dispatch_async(dispatch_get_main_queue(),
                        ^{



                            AVAssetTrack * videoTrack = nil;
                            NSArray * tracks = [asset tracksWithMediaType:AVMediaTypeVideo];
                            if ([tracks count] == 1)
                            {
                                videoTrack = [tracks objectAtIndex:0];

                                videoDuration = CMTimeGetSeconds([videoTrack timeRange].duration);

                                NSError * error = nil;

                                // _movieReader is a member variable
                                _movieReader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
                                if (error)
                                    NSLog(@"%@", error.localizedDescription);       

                                NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
                                NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_420YpCbCr8Planar];

                                NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 

                                AVAssetReaderTrackOutput* output = [AVAssetReaderTrackOutput 
                                                         assetReaderTrackOutputWithTrack:videoTrack 
                                                         outputSettings:videoSettings];
                                output.alwaysCopiesSampleData = NO;

                                [_movieReader addOutput:output];

                                if ([_movieReader startReading])
                                {
                                    NSLog(@"reading started");

                                    [self readNextMovieFrame];
                                }
                                else
                                {
                                    NSLog(@"reading can't be started");
                                }
                            }
                        });
     }];
}


- (void) readNextMovieFrame
{
    //NSLog(@"readNextMovieFrame called");
    if (_movieReader.status == AVAssetReaderStatusReading)
    {
        //NSLog(@"status is reading");

        AVAssetReaderTrackOutput * output = [_movieReader.outputs objectAtIndex:0];
        CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
        if (sampleBuffer)
        { // I'm guessing this is the expensive part that we can skip if we want to skip frames
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

            // Lock the image buffer
            CVPixelBufferLockBaseAddress(imageBuffer,0); 

            // Get information of the image
            uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
            size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
            size_t width = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer); 

            // do my pixel analysis

            // Unlock the image buffer
            CVPixelBufferUnlockBaseAddress(imageBuffer,0);
            CFRelease(sampleBuffer);


            [self readNextMovieFrame];
        }
        else
        {
            NSLog(@"could not copy next sample buffer. status is %d", _movieReader.status);

            NSTimeInterval scanDuration = -[startTime timeIntervalSinceNow];

            float scanMultiplier = videoDuration / scanDuration;

            NSString* info = [NSString stringWithFormat:@"Done\n\nvideo duration: %f seconds\nscan duration: %f seconds\nmultiplier: %f", videoDuration, scanDuration, scanMultiplier];

            [self performSelectorOnMainThread:@selector(updateInfo:) withObject:info waitUntilDone:YES];
        }


    }
    else
    {
        NSLog(@"status is now %d", _movieReader.status);


    }

}


- (void) updateInfo: (id)message
{
    NSString* info = [NSString stringWithFormat:@"%@", message];

    [infoTextView setText:info];
}
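
For reference, skipping can at least be approximated within this AVAssetReader pipeline by only analyzing every Nth sample buffer. copyNextSampleBuffer still decodes each frame (the output is configured with a pixel-format setting), so this saves the analysis cost but not the decode. A minimal sketch, where the stride of 15, the loop structure, and the method name readNextMovieFrameSkipping are illustrative assumptions:

- (void) readNextMovieFrameSkipping
{
    // Illustrative stride: analyze one frame out of every 15.
    static const NSUInteger kFrameStride = 15;

    if (_movieReader.status != AVAssetReaderStatusReading)
        return;

    AVAssetReaderTrackOutput * output = [_movieReader.outputs objectAtIndex:0];
    CMSampleBufferRef sampleBuffer = NULL;
    NSUInteger frameIndex = 0;

    // Loop instead of recursing so long videos don't grow the call stack.
    while ((sampleBuffer = [output copyNextSampleBuffer]))
    {
        if (frameIndex % kFrameStride == 0)
        {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

            CVPixelBufferLockBaseAddress(imageBuffer, 0);

            // pixel analysis on this frame only

            CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        }

        CFRelease(sampleBuffer);
        frameIndex++;
    }
}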

Answer

If you don't need frame-accurate processing (i.e., not strictly frame by frame), you should use AVAssetImageGenerator.

This class returns a frame for whatever time you ask for.

Specifically, build an array of times spanning the clip's duration, spaced 0.5 s apart (the iPhone films at about 29.3 fps, so taking every 15th frame works out to roughly one frame every 0.5 seconds), and let the image generator return your frames.

For each frame you can compare the time you requested with the actual time of the frame that was returned. The default tolerance is around 0.5 s from the requested time, but you can change that through these properties:

requestedTimeToleranceBefore
requestedTimeToleranceAfter
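
A minimal sketch of this approach, where the method name scanWithImageGenerator:, the 0.5 s spacing, and the 0.25 s tolerances are illustrative assumptions:

- (void) scanWithImageGenerator:(NSURL *)url
{
    AVURLAsset * asset = [AVURLAsset URLAssetWithURL:url options:nil];

    // Build an array of CMTimes spanning the clip, spaced 0.5 s apart
    // (roughly every 15th frame at ~30 fps). In production, load the
    // "duration" key asynchronously, as done with "tracks" above.
    Float64 duration = CMTimeGetSeconds(asset.duration);
    NSMutableArray * times = [NSMutableArray array];
    for (Float64 t = 0; t < duration; t += 0.5)
    {
        [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(t, 600)]];
    }

    AVAssetImageGenerator * generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];

    // Constrain how far the returned frame may be from the requested time.
    generator.requestedTimeToleranceBefore = CMTimeMakeWithSeconds(0.25, 600);
    generator.requestedTimeToleranceAfter  = CMTimeMakeWithSeconds(0.25, 600);

    [generator generateCGImagesAsynchronouslyForTimes:times
                                    completionHandler:^(CMTime requestedTime,
                                                        CGImageRef image,
                                                        CMTime actualTime,
                                                        AVAssetImageGeneratorResult result,
                                                        NSError * error)
     {
         if (result == AVAssetImageGeneratorSucceeded)
         {
             // Compare requestedTime and actualTime, then run the pixel
             // analysis on `image`.
         }
         else if (error)
         {
             NSLog(@"image generation failed: %@", error.localizedDescription);
         }
     }];
}

Tighter tolerances force the generator to decode closer to the exact times you request, at the cost of speed.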

I hope I answered your question. Good luck.
