Video filtering in iPhone is slow

Problem description

I am trying to filter video on the iPhone. Here's my program structure and source code:

AppDelegate.h
AppDelegate.m
ViewController.h
ViewController.m

The AppDelegate files are the same as the defaults. Here's my ViewController.

//ViewController.h

#import <UIKit/UIKit.h>
#import <GLKit/GLKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <QuartzCore/QuartzCore.h>
#import <CoreImage/CoreImage.h>
#import <ImageIO/ImageIO.h>

@interface ViewController : GLKViewController <AVCaptureVideoDataOutputSampleBufferDelegate>{
    AVCaptureSession *avCaptureSession;
    CIContext *coreImageContext;
    CIImage *maskImage;
    CGSize screenSize;
    CGContextRef cgContext;
    GLuint _renderBuffer;
    float scale;
}

@property (strong, nonatomic) EAGLContext *context;

-(void)setupCGContext;

@end

// ViewController.m
#import "ViewController.h"

@implementation ViewController

@synthesize context;

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    if (!self.context) {
        NSLog(@"Failed to create ES context");
    }

    GLKView *view = (GLKView *)self.view;
    view.context = self.context;
    view.drawableDepthFormat = GLKViewDrawableDepthFormat24;

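    // Core Image context that renders directly into the GLKView's EAGL context.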
    coreImageContext = [CIContext contextWithEAGLContext:self.context];

    glGenRenderbuffers(1, &_renderBuffer);
    glBindRenderbuffer(GL_RENDERBUFFER, _renderBuffer);

    NSError *error;
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];

    [dataOutput setAlwaysDiscardsLateVideoFrames:YES]; 
    [dataOutput setVideoSettings:[NSDictionary  dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] 
                                                              forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

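    // Configure a 1280x720 capture session that feeds BGRA frames to the delegate above.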
    avCaptureSession = [[AVCaptureSession alloc] init];
    [avCaptureSession beginConfiguration];
    [avCaptureSession setSessionPreset:AVCaptureSessionPreset1280x720];
    [avCaptureSession addInput:input];
    [avCaptureSession addOutput:dataOutput];
    [avCaptureSession commitConfiguration];
    [avCaptureSession startRunning];

    [self setupCGContext];
    CGImageRef cgImg = CGBitmapContextCreateImage(cgContext);
    maskImage = [CIImage imageWithCGImage:cgImg];
    CGImageRelease(cgImg);
}

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

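    // Called once per captured frame (on the main queue, as configured above):
    // wrap the pixel buffer in a CIImage, apply the sepia filter, and draw the
    // result through the GL-backed Core Image context.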
    CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    image = [CIFilter filterWithName:@"CISepiaTone"
                       keysAndValues:kCIInputImageKey, image,
                                     @"inputIntensity", [NSNumber numberWithFloat:0.8],
                                     nil].outputImage;

    [coreImageContext drawImage:image atPoint:CGPointZero fromRect:[image extent] ];

    [self.context presentRenderbuffer:GL_RENDERBUFFER];
}

-(void)setupCGContext {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * screenSize.width;
    NSUInteger bitsPerComponent = 8;
    cgContext = CGBitmapContextCreate(NULL, screenSize.width, screenSize.height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast);

    CGColorSpaceRelease(colorSpace);
}

The sepia filter works, but the video is a little slow. When I don't apply the filter, the video is normal. Any ideas on how I can improve this and make it faster?

Thanks.

Recommended answer

As I describe here, the sepia filter in Core Image wasn't quite able to run in real time, but other filters might be. It depends on the hardware capabilities of the target device, as well as the iOS version (Core Image's performance has improved significantly over the last several iOS versions).
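
If you do stick with Core Image, two setup tweaks are commonly worth trying with a pipeline like the one in the question: create the CIContext without color management (which skips a per-frame color-space conversion), and build the CIFilter once rather than once per frame. A rough sketch, assuming an extra sepiaCIFilter instance variable on the question's view controller:

// Create the Core Image context without color management; passing NSNull for
// the working color space avoids a per-frame color-space conversion.
coreImageContext = [CIContext contextWithEAGLContext:self.context
                                             options:[NSDictionary dictionaryWithObject:[NSNull null]
                                                                                  forKey:kCIContextWorkingColorSpace]];

// Build the sepia filter once (e.g. in viewDidLoad) and keep it in the assumed
// sepiaCIFilter instance variable, instead of creating a new CIFilter per frame.
sepiaCIFilter = [CIFilter filterWithName:@"CISepiaTone"];
[sepiaCIFilter setValue:[NSNumber numberWithFloat:0.8f] forKey:@"inputIntensity"];

// Then, for each captured frame:
[sepiaCIFilter setValue:image forKey:kCIInputImageKey];
CIImage *filteredImage = [sepiaCIFilter outputImage];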

However, if I may plug my open source framework again, GPUImage lets you do this much, much faster. It can apply a sepia tone filter to a 640x480 frame of video in 2.5 ms on an iPhone 4, which is more than fast enough for the 30 FPS video from that camera.

The following code will do live filtering of video from the rear-mounted camera on an iOS device, displaying that video within a portrait-oriented view:

videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];

sepiaFilter = [[GPUImageSepiaFilter alloc] init];
GPUImageRotationFilter *rotationFilter = [[GPUImageRotationFilter alloc] initWithRotation:kGPUImageRotateRight];

[videoCamera addTarget:rotationFilter];
[rotationFilter addTarget:sepiaFilter];
filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:filterView];
[sepiaFilter addTarget:filterView];

[videoCamera startCameraCapture];
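
For this to work as written, videoCamera, sepiaFilter, and filterView are assumed to be instance variables of the view controller (the snippet assigns to them without declaring them, and the camera in particular has to outlive viewDidLoad for capture to continue). A minimal sketch of those declarations:

// Assumed instance variable declarations backing the snippet above.
GPUImageVideoCamera *videoCamera;
GPUImageSepiaFilter *sepiaFilter;
GPUImageView *filterView;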
