Capture images in the background?


Problem Description



I'm trying to capture images in the background from the camera without loading a camera or preview interface.

In my app photos are taken in the background with no preview screen, just the normal app screen and then shown to the user later.

Can someone please point me in the right direction?

Solution

You have to use AVCaptureSession & AVCaptureDeviceInput.

Here is part of the code that may help you:

// Imports needed for the UIKit and AVFoundation types used below
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface MyViewController : UIViewController
{
    AVCaptureStillImageOutput *_output;
    AVCaptureConnection *_videoConnection;
    bool _isCaptureSessionStarted;
}

@property (retain, nonatomic) AVCaptureDevice *frontalCamera;

- (void)takePhoto;

Implementation:

@interface MyViewController ()

@end

@implementation MyViewController

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        _isCaptureSessionStarted = false;
    }
    return self;
}

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Finding frontal camera
    NSArray *cameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (int i = 0; i < cameras.count; i++) {
        AVCaptureDevice *camera = [cameras objectAtIndex:i];

        if (camera.position == AVCaptureDevicePositionFront) {
            self.frontalCamera = camera;

            [self.frontalCamera addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:nil];
            [self.frontalCamera addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:nil];
        }
    }
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (!self.frontalCamera.adjustingExposure && !self.frontalCamera.adjustingWhiteBalance) {
        if (_isCaptureSessionStarted) {
            [self captureStillImage];
        }
    }
}

- (void)takePhoto
{
    if (self.frontalCamera != nil) {
        // Add camera to session
        AVCaptureSession *session = [[AVCaptureSession alloc] init];

        NSError *error;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:self.frontalCamera error:&error];

        if (!error && [session canAddInput:input]) {
            [session addInput:input];

            // Capture still image
            _output = [[AVCaptureStillImageOutput alloc] init];

            // Captured image settings
            [_output setOutputSettings:[[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]];

            if ([session canAddOutput:_output]) {
                [session addOutput:_output];

                _videoConnection = nil;

                for (AVCaptureConnection *connection in _output.connections) {
                    for (AVCaptureInputPort *port in [connection inputPorts]) {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                            _videoConnection = connection;
                            break;
                        }
                    }
                    if (_videoConnection) {
                        break;
                    }
                }

                if (_videoConnection) {
                    [session startRunning];
                    NSLock *lock = [[[NSLock alloc] init] autorelease];
                    [lock lock];
                    _isCaptureSessionStarted = true;
                    [lock unlock];
                }
            }
        } else {
            NSLog(@"%@",[error localizedDescription]);
        }
    }
}

- (void)captureStillImage
{
    [_output captureStillImageAsynchronouslyFromConnection:_videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        NSLock *lock = [[[NSLock alloc] init] autorelease];
        [lock lock];
        _isCaptureSessionStarted = false;
        [lock unlock];

        if (imageDataSampleBuffer != NULL) {
            NSData *bitmap = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

//            You can get image here via [[UIImage alloc] initWithData:bitmap]
        }
    }];
}
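
As the final comment hints, the JPEG data can be turned into a UIImage inside the completion handler and handed to the UI. A minimal sketch of that step, assuming a hypothetical photoTaken: helper on the same controller that stores or shows the result; the completion handler is not guaranteed to run on the main thread, so UIKit work goes back to the main queue, and the autorelease only matches the answer's manual-reference-counting style (drop it under ARC):

    // Inside the completionHandler, after bitmap has been created:
    UIImage *image = [[[UIImage alloc] initWithData:bitmap] autorelease];

    // Hand the photo to the UI on the main queue.
    dispatch_async(dispatch_get_main_queue(), ^{
        [self photoTaken:image];   // hypothetical method that stores or displays the image
    });

To trigger a capture, call [self takePhoto]; once the camera has finished adjusting exposure and white balance, the KVO callback invokes captureStillImage. Also remember to remove the two KVO observers (and stop the session) when the controller is torn down, for example in dealloc, so the callbacks cannot fire on a deallocated object.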
