How to capture image with AVFoundation framework?


Question

I have the following code to open the camera in a UIView, and it is working right now.

But I have two buttons, as in this screenshot: one for capturing a photo and another for uploading a photo from the library.

How do I capture a photo without going to the native camera?

Here is my .h file code

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface bgCameraController : UIViewController<AVCaptureMetadataOutputObjectsDelegate>

@property (weak, nonatomic) IBOutlet UIView *cam;
@property (strong, nonatomic) IBOutlet UIImageView *imageView;

- (IBAction)takePhoto:  (UIButton *)sender;
- (IBAction)selectPhoto:(UIButton *)sender;
@end

Here is my .m file code

#import "bgCameraController.h"

@interface bgCameraController ()
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *videoPreviewLayer;
@property (nonatomic, strong) AVAudioPlayer *audioPlayer;
@property (nonatomic) BOOL isReading;

-(BOOL)startReading;
-(void)stopReading;
-(void)loadBeepSound;
@end

@implementation bgCameraController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self loadBeepSound];
    [self startReading];

    // Do any additional setup after loading the view.
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (BOOL)startReading {
    NSError *error;

    // Get an instance of the AVCaptureDevice class to initialize a device object and provide the video
    // as the media type parameter.
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Get an instance of the AVCaptureDeviceInput class using the previous device object.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];

    if (!input) {
        // If any error occurs, simply log the description of it and don't continue any more.
        NSLog(@"%@", [error localizedDescription]);
        return NO;
    }

    // Initialize the captureSession object.
    _captureSession = [[AVCaptureSession alloc] init];
    // Set the input device on the capture session.
    [_captureSession addInput:input];


    // Initialize a AVCaptureMetadataOutput object and set it as the output device to the capture session.
    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [_captureSession addOutput:captureMetadataOutput];

    // Create a new serial dispatch queue.
    dispatch_queue_t dispatchQueue;
    dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeQRCode]];

    // Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
    _videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    [_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [_videoPreviewLayer setFrame:_cam.layer.bounds];
    [_cam.layer addSublayer:_videoPreviewLayer];


    // Start video capture.
    [_captureSession startRunning];

    return YES;
}


-(void)stopReading{
    // Stop video capture and make the capture session object nil.
    [_captureSession stopRunning];
    _captureSession = nil;

    // Remove the video preview layer from the viewPreview view's layer.
    //[_videoPreviewLayer removeFromSuperlayer];
}


-(void)loadBeepSound{
    // Get the path to the beep.mp3 file and convert it to a NSURL object.
    NSString *beepFilePath = [[NSBundle mainBundle] pathForResource:@"beep" ofType:@"mp3"];
    NSURL *beepURL = [NSURL fileURLWithPath:beepFilePath]; // use fileURLWithPath: (not URLWithString:) for a local file path

    NSError *error;

    // Initialize the audio player object using the NSURL object previously set.
    _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:beepURL error:&error];
    if (error) {
        // If the audio player cannot be initialized then log a message.
        // NSLog(@"Could not play beep file.");
        //NSLog(@"%@", [error localizedDescription]);
    }
    else{
        // If the audio player was successfully initialized then load it in memory.
        [_audioPlayer prepareToPlay];
    }
}
-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection{

    // Check if the metadataObjects array is not nil and it contains at least one object.
    if (metadataObjects != nil && [metadataObjects count] > 0) {
        // Get the metadata object.
        // NSLog(@"%@",metadataObjects);
        AVMetadataMachineReadableCodeObject *metadataObj = [metadataObjects objectAtIndex:0];
        if ([[metadataObj type] isEqualToString:AVMetadataObjectTypeQRCode]) {
            // If the found metadata is equal to the QR code metadata then update the status label's text,
            // stop reading and change the bar button item's title and the flag's value.
            // Everything is done on the main thread.
            NSString *result=[metadataObj stringValue];
            [self performSelectorOnMainThread:@selector(setQRcodeValues:) withObject:result waitUntilDone:NO];
            //  [_result performSelectorOnMainThread:@selector(setText:) withObject:[metadataObj stringValue] waitUntilDone:NO];

            [self performSelectorOnMainThread:@selector(stopReading) withObject:nil waitUntilDone:NO];
            // [_button performSelectorOnMainThread:@selector(setTitle:) withObject:@"Start!" waitUntilDone:NO];

            _isReading = NO;

            // If the audio player is not nil, then play the sound effect.
            if (_audioPlayer) {
                [_audioPlayer play];
            }
        }
    }


}

/*
#pragma mark - Navigation

// In a storyboard-based application, you will often want to do a little preparation before navigation
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
    // Get the new view controller using [segue destinationViewController].
    // Pass the selected object to the new view controller.
}
*/

@end

Please help me capture a photo by tapping that button (see the linked image).

Solution

I have captured an image while scanning a QR code like this:

1) First, add a property for the AVCaptureStillImageOutput:

@property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;

2) Set a session preset on the AVCaptureSession after initializing it (see the sketch after step 3 for where this fits in the question's startReading):

[self.session setSessionPreset:AVCaptureSessionPreset640x480];

3) Now add the AVCaptureStillImageOutput as an output of the AVCaptureSession:

// Prepare an output for snapshotting
self.stillImageOutput = [AVCaptureStillImageOutput new];
[self.session addOutput:self.stillImageOutput];
self.stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
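
The answer's self.session is the question's _captureSession. As a minimal sketch (assuming the stillImageOutput property from step 1 has been added to bgCameraController), steps 2 and 3 would slot into the question's startReading like this:

// Inside startReading, right after the session is created:
_captureSession = [[AVCaptureSession alloc] init];
[_captureSession setSessionPreset:AVCaptureSessionPreset640x480];
[_captureSession addInput:input];

// Add the still-image output alongside the metadata output,
// before -startRunning is called.
self.stillImageOutput = [AVCaptureStillImageOutput new];
self.stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
[_captureSession addOutput:self.stillImageOutput];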

4) Add the code below to the delegate method captureOutput:didOutputMetadataObjects:fromConnection: in order to capture the scanned image:

 __block UIImage *scannedImg = nil;
// Grab a still frame from the video connection while the QR code is in view
AVCaptureConnection *stillConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:stillConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if(error) {
        NSLog(@"There was a problem");
        return;
    }

    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];

    scannedImg = [UIImage imageWithData:jpegData];
    NSLog(@"scannedImg : %@",scannedImg);
}];
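
Note that the completion handler runs asynchronously, so scannedImg is only valid inside the block. To drive the question's capture button directly, the same stillImageOutput can also be triggered from the takePhoto: IBAction. Here is a minimal sketch under that assumption (the hop to the main queue is required because the handler may fire on a background thread):

- (IBAction)takePhoto:(UIButton *)sender {
    AVCaptureConnection *stillConnection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:stillConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (error || imageDataSampleBuffer == NULL) {
            NSLog(@"Still image capture failed: %@", error);
            return;
        }
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *photo = [UIImage imageWithData:jpegData];
        // UIKit must only be touched on the main thread.
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = photo;
        });
    }];
}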

For reference, see CodeScanViewController.

That's it. Enjoy!
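
One caveat: AVCaptureStillImageOutput was deprecated in iOS 10 in favor of AVCapturePhotoOutput. For newer deployment targets, the equivalent capture looks roughly like the sketch below (photoOutput is a hypothetical property; the controller is assumed to adopt AVCapturePhotoCaptureDelegate, and fileDataRepresentation requires iOS 11+):

// Session setup: add an AVCapturePhotoOutput instead of the still-image output.
self.photoOutput = [AVCapturePhotoOutput new]; // hypothetical property on the controller
[_captureSession addOutput:self.photoOutput];

// Trigger a capture, e.g. from takePhoto:.
[self.photoOutput capturePhotoWithSettings:[AVCapturePhotoSettings photoSettings] delegate:self];

// AVCapturePhotoCaptureDelegate callback (iOS 11+).
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        NSLog(@"Photo capture failed: %@", error);
        return;
    }
    UIImage *image = [UIImage imageWithData:[photo fileDataRepresentation]];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });
}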
