Objective-C Simple way to take a photo without a camera interface. Just get a picture from the camera and save it to a file


Problem description

I can't find a simple way of taking a photo without a camera interface. I just need to get a picture from the camera and save it to a file.

Recommended answer

I used this code to take a photo with the front camera. Not all of the code is mine, but I couldn't find a link to the original source. This code also produces a shutter sound. The image quality is not great (it's quite dark), so the code needs a tweak or two.

-(void) takePhoto 
{
    AVCaptureDevice *frontalCamera;

    // Enumerate all video-capable capture devices and pick out the front camera.
    NSArray *allCameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for ( int i = 0; i < allCameras.count; i++ )
    {
        AVCaptureDevice *camera = [allCameras objectAtIndex:i];

        if ( camera.position == AVCaptureDevicePositionFront )
        {
            frontalCamera = camera;
        }
    }

    if ( frontalCamera != nil )
    {
        // Build a capture session with the front camera as input and a JPEG still-image output.
        photoSession = [[AVCaptureSession alloc] init];

        NSError *error;
        AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:frontalCamera error:&error];

        if ( !error && [photoSession canAddInput:input] )
        {
            [photoSession addInput:input];

            AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];

            [output setOutputSettings:
             [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG,AVVideoCodecKey,nil]];

            if ( [photoSession canAddOutput:output] )
            {
                [photoSession addOutput:output];

                // Find the connection that carries video from the camera input to the still-image output.
                AVCaptureConnection *videoConnection = nil;

                for (AVCaptureConnection *connection in output.connections)
                {
                    for (AVCaptureInputPort *port in [connection inputPorts])
                    {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo] )
                        {
                            videoConnection = connection;
                            break;
                        }
                    }
                    if (videoConnection) { break; }
                }

                if ( videoConnection )
                {
                    // Start the session and immediately request a single still frame.
                    [photoSession startRunning];

                    [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                                        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

                        if (imageDataSampleBuffer != NULL)
                        {
                            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                            UIImage *photo = [[UIImage alloc] initWithData:imageData];
                            [self processImage:photo]; //this is a custom method
                        }
                    }];
                }
            }
        }
    }
}
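processImage: above is a custom placeholder. Since the question asks for the picture to be saved to a file, here is a minimal sketch of one possible implementation; the file name photo.jpg and the 0.9 compression quality are arbitrary assumptions, not part of the original answer.

- (void)processImage:(UIImage *)photo
{
    // Re-encode the UIImage as JPEG and write it into the app's Documents directory.
    // "photo.jpg" and the 0.9 quality are arbitrary example choices.
    NSData *jpegData = UIImageJPEGRepresentation(photo, 0.9);

    NSString *documentsPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                   NSUserDomainMask, YES) firstObject];
    NSString *filePath = [documentsPath stringByAppendingPathComponent:@"photo.jpg"];

    NSError *writeError = nil;
    if (![jpegData writeToFile:filePath options:NSDataWritingAtomic error:&writeError])
    {
        NSLog(@"Could not save photo: %@", writeError);
    }
}

If you do not need a UIImage at all, you could also write the imageData from the completion handler straight to disk and skip the second JPEG encode.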

photoSession is an AVCaptureSession * ivar of the class holding the takePhoto method.
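
For completeness, a minimal sketch of that declaration is shown below; the class name CameraGrabber is an assumption for illustration only.

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface CameraGrabber : UIViewController
{
    // Kept as an ivar so the session stays alive until the asynchronous capture completes.
    AVCaptureSession *photoSession;
}

- (void)takePhoto;
- (void)processImage:(UIImage *)image; // the custom method called from takePhoto

@end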

EDIT (tweak): If you change the if ( videoConnection ) block to the code below, you will add a 1-second delay and get a good image.

if ( videoConnection )
{
    [photoSession startRunning];

    // Wait a second after startRunning so the camera can adjust exposure; capturing too early gives a dark image.
    dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC);
    dispatch_after(popTime, dispatch_get_main_queue(), ^(void){

        [output captureStillImageAsynchronouslyFromConnection:videoConnection
                                            completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *photo = [[UIImage alloc] initWithData:imageData];
                [self processImage:photo];
            }
        }];
    });
}

If the lag is not acceptable for your application, you could split the code in two parts, start the photoSession in viewDidAppear (or somewhere similar), and simply take an immediate snapshot whenever needed - usually after some user interaction. A sketch of that split follows the note below.

dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 0.25 * NSEC_PER_SEC);

also produces a good result, so there is no need for a whole second of lag.
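
Here is a minimal sketch of that split, assuming an extra AVCaptureStillImageOutput * ivar named photoOutput; the method names startPhotoSession and snapPhoto are made up for illustration, and the error handling from the full takePhoto above is trimmed for brevity.

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self startPhotoSession]; // warm the camera up ahead of time
}

- (void)startPhotoSession
{
    // Same device/input/output setup as in takePhoto, but the session keeps running
    // so a later capture does not need an extra warm-up delay.
    AVCaptureDevice *frontalCamera = nil;
    for (AVCaptureDevice *camera in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo])
    {
        if (camera.position == AVCaptureDevicePositionFront) { frontalCamera = camera; }
    }
    if (frontalCamera == nil) { return; }

    photoSession = [[AVCaptureSession alloc] init];

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontalCamera error:&error];
    if (error || ![photoSession canAddInput:input]) { return; }
    [photoSession addInput:input];

    photoOutput = [[AVCaptureStillImageOutput alloc] init];
    [photoOutput setOutputSettings:@{ AVVideoCodecKey : AVVideoCodecJPEG }];
    if (![photoSession canAddOutput:photoOutput]) { return; }
    [photoSession addOutput:photoOutput];

    [photoSession startRunning];
}

- (void)snapPhoto
{
    // Called later, e.g. from a button tap; the session is already running.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in photoOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) { videoConnection = connection; break; }
        }
        if (videoConnection) { break; }
    }
    if (videoConnection == nil) { return; }

    [photoOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                             completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (imageDataSampleBuffer != NULL)
        {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            [self processImage:[[UIImage alloc] initWithData:imageData]];
        }
    }];
}

Because the session is already running by the time snapPhoto is called, the capture is effectively immediate and no dispatch_after delay is needed.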

Note that this code is written to take a photo with the front camera - I'm sure you will know how to adapt it if you need to use the back camera.
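
For what it's worth, the only required change for the back camera is the position check in the device loop (keeping the frontalCamera variable name from the code above, which you may then want to rename):

        if ( camera.position == AVCaptureDevicePositionBack ) // instead of AVCaptureDevicePositionFront
        {
            frontalCamera = camera; // consider renaming this variable once it points at the back camera
        }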
