face detection iOS from camera

Question

I receive the image from the image picker:

-(void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = info[UIImagePickerControllerMediaType];

    [self dismissViewControllerAnimated:YES completion:nil];

    if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {
        UIImage *image = info[UIImagePickerControllerOriginalImage];

        //imgvprofileImage.image = image;
        //[self detectForFacesInUIImage:[UIImage imageNamed:@"image00.jpg"]];

        [self detectForFacesInUIImage:image];
    }
    else if ([mediaType isEqualToString:(NSString *)kUTTypeMovie])
    {
        // Code here to support video if enabled
    }
}

When I pass an image like this

[self detectForFacesInUIImage:[UIImage imageNamed:@"image00.jpg"]]; 

The detection works well and finds a face but when I use the image returned from the camera it doesn't work.

[self detectForFacesInUIImage:image];

This is the function I use to detect the face:

-(void)detectForFacesInUIImage:(UIImage *)facePicture
{
    CIImage *image = [CIImage imageWithCGImage:facePicture.CGImage];

    // Low-accuracy face detector; note that no orientation hint is passed here.
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:@{ CIDetectorAccuracy : CIDetectorAccuracyLow }];

    NSArray *features = [detector featuresInImage:image];

    if (features.count == 0) {
        NSLog(@"There are no faces in the captured image");
    }

    for (CIFaceFeature *faceObject in features) {
        // Flip the bounds from Core Image's bottom-left origin to UIKit's top-left origin.
        CGRect modifiedFaceBounds = faceObject.bounds;
        modifiedFaceBounds.origin.y = facePicture.size.height - faceObject.bounds.size.height - faceObject.bounds.origin.y;

        [self addSubViewWithFrame:facePicture toRect:modifiedFaceBounds];
    }
}


Answer

The problem is the image orientation.

Can't remember where I got this from, but it works:

- (void)detectForFaces:(CGImageRef)facePicture orientation:(UIImageOrientation)orientation
{
    CIImage *image = [CIImage imageWithCGImage:facePicture];

    // Low-accuracy face detector built on a default Core Image context.
    CIContext *context = [CIContext contextWithOptions:nil];
    NSDictionary *opts = @{ CIDetectorAccuracy : CIDetectorAccuracyLow };
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:context
                                              options:opts];

    // Map the UIImage orientation to the EXIF orientation value the detector expects.
    int exifOrientation = 1; // default to "up" so the variable is never left uninitialized
    switch (orientation) {
        case UIImageOrientationUp:            exifOrientation = 1; break;
        case UIImageOrientationDown:          exifOrientation = 3; break;
        case UIImageOrientationLeft:          exifOrientation = 8; break;
        case UIImageOrientationRight:         exifOrientation = 6; break;
        case UIImageOrientationUpMirrored:    exifOrientation = 2; break;
        case UIImageOrientationDownMirrored:  exifOrientation = 4; break;
        case UIImageOrientationLeftMirrored:  exifOrientation = 5; break;
        case UIImageOrientationRightMirrored: exifOrientation = 7; break;
        default:                              break;
    }

    // Tell the detector how the pixel data is rotated; without this hint,
    // faces in camera images (which are usually stored rotated) are not found.
    opts = @{ CIDetectorImageOrientation : @(exifOrientation) };

    NSArray *features = [detector featuresInImage:image options:opts];

    if ([features count] > 0) {
        CIFaceFeature *face = [features lastObject];
        NSLog(@"%@", NSStringFromCGRect(face.bounds));
    }
}

How to use:

UIImage *image = // some image here;
[self detectForFaces:image.CGImage orientation:image.imageOrientation];
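As an alternative (a minimal sketch, not part of the original answer): instead of passing the orientation to the detector, you can redraw the picked image into an up-oriented copy and keep the question's detectForFacesInUIImage: unchanged. The helper name normalizedImage: below is made up for illustration.

// Hypothetical helper: redraws the image so its pixel data matches
// UIImageOrientationUp. After this, CIDetector sees the pixels the same
// way UIKit displays them, so no CIDetectorImageOrientation hint is needed.
- (UIImage *)normalizedImage:(UIImage *)image
{
    if (image.imageOrientation == UIImageOrientationUp) {
        return image; // already upright, nothing to do
    }
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}

With that in place, the picker callback could call [self detectForFacesInUIImage:[self normalizedImage:image]] and the original coordinate-flipping code keeps working, at the cost of one extra redraw of the full image.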
