iPhone SDK: Camera access?


Question

I want to know how to access the iPhone's camera and work with it in real time: for example, just draw on the camera view.

Another related question:

Can I display 4 camera views at once, like in Photo Booth on the Mac?

Answer

You can do it by using AVFoundation:

- (void)initCapture {

    // Capture input from the default video camera (error handling omitted here).
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput 
                                          deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] 
                                          error:nil];

    // Uncompressed frame output; drop late frames so the preview stays live.
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES; 

    // Deliver sample buffers on a dedicated serial queue.
    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    // Ask for 32-bit BGRA pixels, which CGBitmapContextCreate can consume directly.
    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; 
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 
    [captureOutput setVideoSettings:videoSettings]; 

    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession setSessionPreset:AVCaptureSessionPresetLow];

    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];

    [self.captureSession startRunning];

    // Four 200x150 layers laid out in a 2x2 grid; each is rotated 90 degrees
    // because the camera delivers frames in landscape orientation.
    self.customLayer = [CALayer layer];
    self.customLayer.frame = CGRectMake(5 - 25, 25, 200, 150);
    self.customLayer.transform = CATransform3DRotate(CATransform3DIdentity, M_PI / 2.0f, 0, 0, 1);
    [self.view.layer insertSublayer:self.customLayer atIndex:4];

    self.customLayer1 = [CALayer layer];
    self.customLayer1.frame = CGRectMake(165 - 25, 25, 200, 150);
    self.customLayer1.transform = CATransform3DRotate(CATransform3DIdentity, M_PI / 2.0f, 0, 0, 1);
    [self.view.layer addSublayer:self.customLayer1];

    self.customLayer2 = [CALayer layer];
    self.customLayer2.frame = CGRectMake(5 - 25, 210 + 25, 200, 150);
    self.customLayer2.transform = CATransform3DRotate(CATransform3DIdentity, M_PI / 2.0f, 0, 0, 1);
    [self.view.layer addSublayer:self.customLayer2];

    self.customLayer3 = [CALayer layer];
    self.customLayer3.frame = CGRectMake(165 - 25, 210 + 25, 200, 150);
    self.customLayer3.transform = CATransform3DRotate(CATransform3DIdentity, M_PI / 2.0f, 0, 0, 1);
    [self.view.layer addSublayer:self.customLayer3];
}
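
The frame arithmetic above places the four layers in a 2x2 grid: column origins start at x = 5 - 25 and step by 160 points, row origins start at y = 25 and step by 210 points. A minimal C sketch of that layout (the helper name `gridOrigin` is mine, not from the answer):

```c
#include <assert.h>

/* Hypothetical helper: origin of cell i (0..3) in the 2x2 preview grid.
   Column 0 starts at x = 5 - 25 = -20; columns step by 160 points.
   Row 0 starts at y = 25; rows step by 210 points. */
static void gridOrigin(int i, int *x, int *y) {
    int col = i % 2;   /* 0 = left column, 1 = right column */
    int row = i / 2;   /* 0 = top row, 1 = bottom row */
    *x = -20 + 160 * col;
    *y = 25 + 210 * row;
}
```

Plugging in i = 0..3 reproduces the four CGRectMake origins in the code above.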



#pragma mark -
#pragma mark AVCaptureSession delegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
       fromConnection:(AVCaptureConnection *)connection 
{ 
    // This callback runs on the capture queue, not the main thread,
    // so it needs its own autorelease pool (pre-ARC code).
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    /* Lock the image buffer */
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 
    /* Get information about the image */
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer);  

    /* Create a CGImageRef from the CVImageBufferRef */
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage2 = CGBitmapContextCreateImage(newContext); 
    /* Release the context and color space */
    CGContextRelease(newContext); 
    CGColorSpaceRelease(colorSpace);

    /* Push the same frame to all four layers; layer contents must be set on the main thread */
    [self.customLayer performSelectorOnMainThread:@selector(setContents:) withObject:(id)newImage2 waitUntilDone:YES];
    [self.customLayer1 performSelectorOnMainThread:@selector(setContents:) withObject:(id)newImage2 waitUntilDone:YES];
    [self.customLayer2 performSelectorOnMainThread:@selector(setContents:) withObject:(id)newImage2 waitUntilDone:YES];
    [self.customLayer3 performSelectorOnMainThread:@selector(setContents:) withObject:(id)newImage2 waitUntilDone:YES];

    /* Release the CGImageRef */
    CGImageRelease(newImage2);

    /* Unlock the image buffer */
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    [pool drain];
} 
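
In the delegate above, `baseAddress` points at BGRA pixels laid out row by row, where each row occupies `bytesPerRow` bytes, which can be larger than `width * 4` because of row padding. The byte offset of a given pixel can be sketched like this (the helper name `pixelOffset` is mine, not an AVFoundation call):

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical helper: byte offset of pixel (x, y) in a 32BGRA buffer.
   Each pixel is 4 bytes (B, G, R, A in memory order); rows are
   bytesPerRow bytes apart, which may exceed width * 4 due to padding,
   so always use bytesPerRow, never width * 4, to step between rows. */
static size_t pixelOffset(size_t x, size_t y, size_t bytesPerRow) {
    return y * bytesPerRow + x * 4;
}
```

This is the addressing you would use to read or modify individual pixels in the locked buffer, e.g. for the "draw on the camera view" part of the question.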

It works fine.

http://crayoncoding.blogspot.com/2011/04/iphone-4-camera-views-at-once.html

See the above link for the detailed code.
