iOS: Capture image from front facing camera
Problem Description
I am making an application where I would like to capture an image from the front-facing camera, without presenting a capture screen of any kind. I want to take a picture completely in code, without any user interaction. How would I do this for the front-facing camera?
How to capture an image using the AVFoundation front-facing camera:
Development Caveats:
- Check your app and image orientation settings carefully
- AVFoundation and its associated frameworks are nasty behemoths and very difficult to understand/implement. I've made my code as lean as possible, but please check out this excellent tutorial for a better explanation (website not available any more, link via archive.org): http://www.benjaminloulier.com/posts/ios4-and-direct-access-to-the-camera
ViewController.h
// Frameworks
#import <CoreVideo/CoreVideo.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>
@interface CameraViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
// Camera
@property (weak, nonatomic) IBOutlet UIImageView* cameraImageView;
@property (strong, nonatomic) AVCaptureDevice* device;
@property (strong, nonatomic) AVCaptureSession* captureSession;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer* previewLayer;
@property (strong, nonatomic) UIImage* cameraImage;
@end
ViewController.m
#import "CameraViewController.h"
@implementation CameraViewController
- (void)viewDidLoad
{
[super viewDidLoad];
[self setupCamera];
[self setupTimer];
}
- (void)setupCamera
{
NSArray* devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for(AVCaptureDevice *device in devices)
{
if([device position] == AVCaptureDevicePositionFront)
self.device = device;
}
AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc] init];
output.alwaysDiscardsLateVideoFrames = YES;
dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
NSString* key = (NSString *) kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[output setVideoSettings:videoSettings];
self.captureSession = [[AVCaptureSession alloc] init];
[self.captureSession addInput:input];
[self.captureSession addOutput:output];
[self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
// CHECK FOR YOUR APP
self.previewLayer.frame = CGRectMake(0, 0, self.view.frame.size.height, self.view.frame.size.width);
self.previewLayer.orientation = AVCaptureVideoOrientationLandscapeRight;
// CHECK FOR YOUR APP
[self.view.layer insertSublayer:self.previewLayer atIndex:0]; // Comment-out to hide preview layer
[self.captureSession startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer,0);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef newImage = CGBitmapContextCreateImage(newContext);
CGContextRelease(newContext);
CGColorSpaceRelease(colorSpace);
self.cameraImage = [UIImage imageWithCGImage:newImage scale:1.0f orientation:UIImageOrientationDownMirrored];
CGImageRelease(newImage);
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
}
- (void)setupTimer
{
[NSTimer scheduledTimerWithTimeInterval:2.0f target:self selector:@selector(snapshot) userInfo:nil repeats:YES]; // Retained by the run loop; invalidate it when you're done
}
- (void)snapshot
{
NSLog(@"SNAPSHOT");
self.cameraImageView.image = self.cameraImage; // Comment-out to hide snapshot
}
@end
Connect this up to a UIViewController with a UIImageView for the snapshot and it'll work! Snapshots are taken programmatically at 2.0 second intervals without any user input. Comment out the selected lines to remove the preview layer and snapshot feedback.
Any more questions/comments, please let me know!