How to output a CIFilter to a Camera view?


Question


I'm just starting out in Objective-C and I'm trying to create a simple app that shows the camera view with a blur effect applied. I have the camera output working with the AVFoundation framework. Now I'm trying to hook up the Core Image framework, but I don't know how: Apple's documentation is confusing to me, and searching online for guides and tutorials has turned up nothing. Thanks in advance for the help.

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
@interface ViewController ()

@property (strong ,nonatomic) CIContext *context;

@end

@implementation ViewController
AVCaptureSession *session;
AVCaptureStillImageOutput *stillImageOutput;

-(CIContext *)context
{
    if(!_context)
    {
        _context = [CIContext contextWithOptions:nil];
    }
    return _context;
}
- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

-(void)viewWillAppear:(BOOL)animated{
    [super viewWillAppear:animated];
    session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];

    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    CGRect frame = self.imageView.frame;

    [previewLayer setFrame:frame];

    [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];


    [rootLayer insertSublayer:previewLayer atIndex:0];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];

    [session addOutput:stillImageOutput];

    [session startRunning];     
}
@end


Answer


Here's something to get you started. This is an updated version of the code from the following link.
https://gist.github.com/eladb/9662102


The trick is to use the AVCaptureVideoDataOutputSampleBufferDelegate protocol.
With this delegate, you can use imageWithCVPixelBuffer: to construct a CIImage from your camera buffer.
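In isolation, those two steps look like this (a minimal sketch; it assumes self conforms to AVCaptureVideoDataOutputSampleBufferDelegate, and explicitly requests BGRA frames, which the full listing below leaves at the default):

```objc
// Sketch: configure the video data output so each frame reaches the delegate.
// kCVPixelFormatType_32BGRA is a pixel format Core Image handles directly.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[output setSampleBufferDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0)];

// Then, inside captureOutput:didOutputSampleBuffer:fromConnection:,
// wrap the frame's pixel buffer in a CIImage:
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
```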


Right now though I'm trying to figure out how to reduce lag. I'll update asap.


Update: Latency is now minimal, and on some effects unnoticeable. Unfortunately, it seems that blur is one of the slowest. You may want to look into vImage.
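For reference, a vImage box blur over the same pixel buffer might look roughly like this (a hedged sketch, not tested here; vImage is part of the Accelerate framework, the kernel size of 21 is an arbitrary choice that must be odd, and error handling is omitted):

```objc
#import <Accelerate/Accelerate.h>

// Sketch: blur a BGRA CVPixelBuffer with vImage's box convolve.
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
vImage_Buffer src = {
    .data     = CVPixelBufferGetBaseAddress(pixelBuffer),
    .height   = CVPixelBufferGetHeight(pixelBuffer),
    .width    = CVPixelBufferGetWidth(pixelBuffer),
    .rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
};
vImage_Buffer dest = { malloc(src.rowBytes * src.height), src.height, src.width, src.rowBytes };

// A box convolve approximates a Gaussian when applied repeatedly;
// even a single pass is much cheaper than CIGaussianBlur.
vImageBoxConvolve_ARGB8888(&src, &dest, NULL, 0, 0, 21, 21, NULL, kvImageEdgeExtend);

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
// ... render dest, then free(dest.data);
```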

#import "ViewController.h"
#import <CoreImage/CoreImage.h>
#import <AVFoundation/AVFoundation.h>

// Declare conformance so setSampleBufferDelegate: accepts self
// and captureOutput:didOutputSampleBuffer:fromConnection: is called.
@interface ViewController () <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (strong, nonatomic) CIContext *coreImageContext;
@property (strong, nonatomic) AVCaptureSession *cameraSession;
@property (strong, nonatomic) AVCaptureVideoDataOutput *videoOutput;
@property (strong, nonatomic) UIView *blurCameraView;
@property (strong, nonatomic) CIFilter *filter;
@property BOOL cameraOpen;

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.blurCameraView = [[UIView alloc]initWithFrame:[[UIScreen mainScreen] bounds]];
    [self.view addSubview:self.blurCameraView];

    //setup filter
    self.filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [self.filter setDefaults];
    [self.filter setValue:@(3.0f) forKey:@"inputRadius"];

    [self setupCamera];
    [self openCamera];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (void)setupCamera
{
    self.coreImageContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @(YES)}];

    // session
    self.cameraSession = [[AVCaptureSession alloc] init];
    [self.cameraSession beginConfiguration];
    [self.cameraSession setSessionPreset:AVCaptureSessionPresetLow];
    [self.cameraSession commitConfiguration];

    // input
    AVCaptureDevice *shootingCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *shootingDevice = [AVCaptureDeviceInput deviceInputWithDevice:shootingCamera error:NULL];
    if ([self.cameraSession canAddInput:shootingDevice]) {
        [self.cameraSession addInput:shootingDevice];
    }

    // video output
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.videoOutput.alwaysDiscardsLateVideoFrames = YES;
    [self.videoOutput setSampleBufferDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0)];
    if ([self.cameraSession canAddOutput:self.videoOutput]) {
        [self.cameraSession addOutput:self.videoOutput];
    }

    if (self.videoOutput.connections.count > 0) {
        AVCaptureConnection *connection = self.videoOutput.connections[0];
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }

    self.cameraOpen = NO;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // turn buffer into an image we can manipulate
    CIImage *result = [CIImage imageWithCVPixelBuffer:imageBuffer];

    // filter
    [self.filter setValue:result forKey:@"inputImage"];

    // render image
    CGImageRef blurredImage = [self.coreImageContext createCGImage:self.filter.outputImage fromRect:result.extent];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.blurCameraView.layer.contents = (__bridge id)blurredImage;
        CGImageRelease(blurredImage);
    });
}

- (void)openCamera {
    if (self.cameraOpen) {
        return;
    }

    self.blurCameraView.alpha = 0.0f;
    [self.cameraSession startRunning];
    [self.view layoutIfNeeded];

    [UIView animateWithDuration:3.0f animations:^{

        self.blurCameraView.alpha = 1.0f;

    }];

    self.cameraOpen = YES;
}

@end

