How to properly crop a UIImage taken with UIImagePickerController?


Problem description

This is the camera overlay for my app,

The yellow square indicates to the user that only the part of the photo inside it (in the camera) will be saved. It works like a crop.

When I save that captured image, it saves a zoomed-in photo [a heavily zoomed photo],

What I found is that when I take a photo, it is of size {2448, 3264}.

I crop the image like this,

- (UIImage *)imageByCroppingImage:(UIImage *)image toSize:(CGSize)size
{
    double x = (image.size.width - size.width) / 2.0;
    double y = (image.size.height - size.height) / 2.0;

    CGRect cropRect = CGRectMake(x, y, size.width, size.height);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], cropRect);

    UIImage *cropped = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    return cropped;
}

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info valueForKey:UIImagePickerControllerOriginalImage];

    if (image) {
        UIImage *newImage = [self imageByCroppingImage:image toSize:CGSizeMake(300.f, 300.f)];
        UIImageWriteToSavedPhotosAlbum(newImage, nil, nil, nil);
    }
}

Note,

The yellow square on the camera is also the same size: width = 300 and height = 300.

If I set the front camera for UIImagePickerController, then it gives me a perfectly cropped image as output. Yes, this is really strange!

I've tried everything from here, Cropping an UIImage. Even https://github.com/Nyx0uf/NYXImagesKit doesn't help.

Any ideas/suggestions?

Update:

From this question: Trying to crop my UIImage to a 1:1 aspect ratio (square), but it keeps enlarging the image, causing it to be blurry. Why?

I followed @DrummerB's answer like this,

    // Work in pixel dimensions (points * scale).
    CGFloat originalWidth = image.size.width * image.scale;
    CGFloat originalHeight = image.size.height * image.scale;
    float smallestDimension = fminf(originalWidth, originalHeight);
    // Square crop anchored at the image's top-left corner.
    CGRect square = CGRectMake(0, 0, smallestDimension, smallestDimension);
    CGImageRef imageRef = CGImageCreateWithImageInRect([image CGImage], square);
    UIImage *squareImage = [UIImage imageWithCGImage:imageRef scale:image.scale orientation:image.imageOrientation];
    UIImageWriteToSavedPhotosAlbum(squareImage, nil, nil, nil);
    CGImageRelease(imageRef);

This is what I captured,

And this is what it gives me,


Now I'm getting a square photo, but note that the output still contains the photo outside the yellow square. What I want is only the part of the photo that lies inside the yellow square. The captured image is still of size {w=2448, h=3264}. Note the red circles, which indicate the outer part of the image that should not be included in the output, since that part is not inside the yellow square.

What's wrong here?

Answer

It looks like the image you are receiving in your implementation is a 300 by 300 pixel crop. The yellow square you have on screen is 300 by 300 points. Points are not the same as pixels. So if your photo is 3264 pixels wide, cropping it to 300 pixels returns an image roughly 1/10th of the original size.
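A minimal sketch of how that point-versus-pixel observation could be applied, assuming the camera preview fills the screen width and the yellow 300-point square is centered on it; the method name, the previewWidthPoints parameter, and the full-width-preview assumption are illustrative, not part of the original answer:

// Sketch: convert the on-screen crop square from points to image pixels, then crop.
// Assumes the preview fills the screen width and the square is centered on it.
- (UIImage *)imageByCroppingImage:(UIImage *)image
           toCenteredSquareOfSide:(CGFloat)sidePoints
                     previewWidth:(CGFloat)previewWidthPoints
{
    // Pixel width of the photo as displayed (2448 in the example above).
    CGFloat displayPixelWidth = image.size.width * image.scale;

    // How many photo pixels correspond to one screen point.
    CGFloat pixelsPerPoint = displayPixelWidth / previewWidthPoints;
    CGFloat sidePixels = sidePoints * pixelsPerPoint;

    // Crop in the CGImage's own coordinate space; because the crop is a centered
    // square, centering works regardless of how the underlying bitmap is rotated.
    CGImageRef sourceRef = image.CGImage;
    CGFloat cgWidth = CGImageGetWidth(sourceRef);
    CGFloat cgHeight = CGImageGetHeight(sourceRef);
    CGRect cropRect = CGRectMake((cgWidth - sidePixels) / 2.0,
                                 (cgHeight - sidePixels) / 2.0,
                                 sidePixels,
                                 sidePixels);

    CGImageRef croppedRef = CGImageCreateWithImageInRect(sourceRef, cropRect);
    // Keep the original scale and orientation so the result displays upright.
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                           scale:image.scale
                                     orientation:image.imageOrientation];
    CGImageRelease(croppedRef);
    return cropped;
}

For example, on a 320-point-wide screen a 2448-pixel-wide photo gives 2448 / 320 ≈ 7.65 pixels per point, so the 300-point yellow square corresponds to a crop of roughly 2295 × 2295 pixels, not 300 × 300.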
