Captured photo is stretched with AVCaptureSession sessionPreset = AVCaptureSessionPresetPhoto


Question

IMPORTANT: if I use session.sessionPreset = AVCaptureSessionPresetHigh; my preview image is not stretched! If I save the photo to the device with UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); the image is normal; it is stretched only in the preview.

I'm using AVFoundation to capture a photo.

session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

CALayer *viewLayer = vImagePreview.layer;
NSLog(@"viewLayer = %@", viewLayer);

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];

captureVideoPreviewLayer.frame = vImagePreview.bounds;
[vImagePreview.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}
[session addInput:input];

stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];

I set the sessionPreset to AVCaptureSessionPresetPhoto:

session.sessionPreset = AVCaptureSessionPresetPhoto;

My capture method:

-(void)captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] )
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection)
        {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
     {
         CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
         if (exifAttachments)
         {
             // Do something with the attachments.
             NSLog(@"attachements: %@", exifAttachments);
         } else {
             NSLog(@"no attachments");
         }

         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
         UIImage *image = [[UIImage alloc] initWithData:imageData];
         NSLog(@"%@",NSStringFromCGSize(image.size));
         [self animateUpTheImageWithImage:image];


     }];

}

Where I add the captured photo's preview:

- (void) animateUpTheImageWithImage:(UIImage*)theImage{

    UIView* preview = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height/*426*/)];
    CALayer *previewLayer = preview.layer;
    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = previewLayer.frame;
    [previewLayer addSublayer:captureVideoPreviewLayer];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

    [self addSubview:preview];

}

And the result is that my captured image is stretched!
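The stretching comes down to an aspect-ratio mismatch: AVCaptureSessionPresetPhoto produces 4:3 stills (e.g. 3264x2448 on older iPhones), while the preview view has a different aspect ratio, so anything that scales each axis independently to fill the view will distort the image. A rough sketch of that geometry (my own illustration with assumed example sizes, not code from the post):

```swift
import Foundation

// If each axis is scaled independently to fill the view ("stretch to fit"),
// the two scale factors differ whenever the aspect ratios differ.
func stretchFactors(imageSize: (w: Double, h: Double),
                    viewSize: (w: Double, h: Double)) -> (x: Double, y: Double) {
    return (viewSize.w / imageSize.w, viewSize.h / imageSize.h)
}

// 4:3 still (assumed 3264x2448) into an assumed 320x480-point (2:3) view:
let f = stretchFactors(imageSize: (3264, 2448), viewSize: (320, 480))
// f.x != f.y, so a naive stretch distorts the image.
// Aspect-fill (conceptually what AVLayerVideoGravityResizeAspectFill does)
// instead applies the larger of the two factors to BOTH axes and crops.
let aspectFillScale = max(f.x, f.y)
```

When f.x and f.y are equal the image fits without distortion; the further apart they are, the more visible the stretch.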

Recommended answer

So I resolved my problem. Here is the code I'm using now, and it's working fine:

    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;

    captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.frame = vImagePreview.bounds;
    [vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

...

stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[session addOutput:stillImageOutput];

capturedImageView = [[UIView alloc]initWithFrame:CGRectMake(0, 0, self.screenWidth,self.screenHeight)];
[self addSubview:capturedImageView];

and, importantly, the output image view:

vImage = [[UIImageView alloc]initWithFrame:CGRectMake(0, 0, self.screenWidth,self.screenHeight)];
    [capturedImageView addSubview:vImage];
    vImage.autoresizingMask = (UIViewAutoresizingFlexibleBottomMargin|UIViewAutoresizingFlexibleHeight|UIViewAutoresizingFlexibleLeftMargin|UIViewAutoresizingFlexibleRightMargin|UIViewAutoresizingFlexibleTopMargin|UIViewAutoresizingFlexibleWidth);
    vImage.contentMode = UIViewContentModeScaleAspectFill;
    vImage.image = theImage;

Some extra information:

The camera layer must be full screen (you can modify its y coordinate, but its width and height must be full size), and the output image view must be too.
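Why this works: both AVLayerVideoGravityResizeAspectFill and UIViewContentModeScaleAspectFill scale the content uniformly, so nothing is distorted; the overflow is simply cropped. A minimal sketch of that computation (my own illustration, not code from the answer):

```swift
import Foundation

// Compute the rect an image occupies when aspect-filled into a view:
// one uniform scale factor (no stretching), centered, edges cropped off-screen.
func aspectFillRect(imageW: Double, imageH: Double,
                    viewW: Double, viewH: Double)
        -> (x: Double, y: Double, w: Double, h: Double) {
    let scale = max(viewW / imageW, viewH / imageH)  // uniform: same factor on both axes
    let w = imageW * scale
    let h = imageH * scale
    return ((viewW - w) / 2, (viewH - h) / 2, w, h)  // center the overflow
}
```

Because width and height share one scale factor, the photo keeps its proportions; making both the preview layer and the image view full screen ensures the live preview and the captured result crop the same way, so the saved photo matches what the user saw.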

I hope this information will be useful to somebody else too.
