Capture picture from video using AVFoundation


Question

I am trying to capture a picture from a video on my iPad. I used Apple's AVCam example as a starting point.

I was able to see the video in my application and to take pictures from it. My problem is that the pixel size of the result image is wrong. I want a fullscreen picture (1024x768) but I get a smaller one (1024x720).

Those are my instance variables:

@property (retain) AVCaptureStillImageOutput *stillImageOutput;
@property (retain) AVCaptureVideoPreviewLayer *previewLayer;
@property (retain) AVCaptureSession *captureSession;
@property (retain) AVCaptureConnection *captureConnection;
@property (retain) UIImage *stillImage;

Here is the code that takes the picture:

- (void)takePicture
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }
    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
                                                             CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
                                                             if (exifAttachments) {
                                                                 NSLog(@"attachements: %@", exifAttachments);
                                                             } else {
                                                                 NSLog(@"no attachments");
                                                             }
                                                             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
                                                             UIImage *image = [[UIImage alloc] initWithData:imageData];
                                                             [self setStillImage:image];
                                                             [image release];
                                                             [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
                                                         }];
}

I thought of resizing the final picture, but that solution would degrade the quality of the image. I also noticed that the CFDictionaryRef exifAttachments dictionary contains the value PixelYDimension = 720;, but I can't seem to find a way to interact with it.

Any help would be very much appreciated. Thank you in advance and have a nice day,

Alex.

EDIT: I'd like to point out that when I say "taking a picture from a video" I meant that the video is coming live from the iPad's camera and it is not a recording.

Solution

I found a solution to my problem. Here it goes in case someone looks for this in the future.

To interact with the camera using AVFoundation we need to initialize an AVCaptureSession instance. Once that's done, we can set its sessionPreset, which indicates the quality level or bitrate of the output. There is a set of different preset constants; to take 1024x768 pictures I used AVCaptureSessionPresetPhoto.
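A minimal sketch of that configuration (the method name is illustrative and the captureSession property matches the declarations in the question; AVCaptureSessionPresetPhoto and canSetSessionPreset: are standard AVFoundation symbols):

```objc
// Illustrative setup: switching the session to the photo preset makes
// stills come out at the sensor's full 4:3 photo resolution (1024x768
// on this hardware) instead of the smaller 16:9 video frame size.
- (void)setupCaptureSession
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
        [session setSessionPreset:AVCaptureSessionPresetPhoto];
    }
    [self setCaptureSession:session];
    [session release]; // pre-ARC, matching the question's retain/release style
}
```

Checking canSetSessionPreset: first is good practice, since not every device supports every preset.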
