Luminosity from iOS camera


Question



I'm trying to make an application in which I have to calculate the brightness from the camera, like this app: http://itunes.apple.com/us/app/megaman-luxmeter/id455660266?mt=8

I found this tutorial: http://b2cloud.com.au/tutorial/obtaining-luminosity-from-an-ios-camera

But I don't know how to adapt it to the live camera feed rather than a single image. Here is my code:

    Image = [[UIImagePickerController alloc] init];
    Image.delegate = self;
    Image.sourceType = UIImagePickerControllerCameraCaptureModeVideo;
    Image.showsCameraControls = NO;
    [Image setWantsFullScreenLayout:YES];
    Image.view.bounds = CGRectMake (0, 0, 320, 480);
    [self.view addSubview:Image.view];

    NSArray* dayArray = [NSArray arrayWithObjects:Image,nil];
    for(NSString* day in dayArray)
    {
        for(int i=1;i<=2;i++)
        {
            UIImage* image = [UIImage imageNamed:[NSString stringWithFormat:@"%@%d.png",day,i]];
            unsigned char* pixels = [image rgbaPixels];
            double totalLuminance = 0.0;
            for(int p=0;p<image.size.width*image.size.height*4;p+=4)
            {
                totalLuminance += pixels[p]*0.299 + pixels[p+1]*0.587 + pixels[p+2]*0.114;
            }
            totalLuminance /= (image.size.width*image.size.height);
            totalLuminance /= 255.0;
            NSLog(@"%@ (%d) = %f",day,i,totalLuminance);
        }
    }

Here are the issues :

"Instance method '-rgbaPixels' not found (return type defaults to 'id')" & "Incompatible pointer types initializing 'unsigned char *' with an expression of type 'id'"

Thanks a lot! =)

Solution

Rather than doing expensive CPU-bound processing of each pixel in an input video frame, let me suggest an alternative approach. My open source GPUImage framework has a luminosity extractor built into it, which uses GPU-based processing to give live luminosity readings from the video camera.

It's relatively easy to set this up. You simply need to allocate a GPUImageVideoCamera instance to represent the camera, allocate a GPUImageLuminosity filter, and add the latter as a target for the former. If you want to display the camera feed to the screen, create a GPUImageView instance and add that as another target for your GPUImageVideoCamera.
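A minimal sketch of that setup might look like the following. This assumes GPUImage's Objective-C API (`initWithSessionPreset:cameraPosition:`, `addTarget:`, `startCameraCapture`); exact names can vary between framework versions, so verify against the version you link:

```objc
#import "GPUImage.h"

// Camera source: 640x480 back camera (a sketch, assuming GPUImage's API).
GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];

// Luminosity filter, added as a target of the camera.
GPUImageLuminosity *luminosityFilter = [[GPUImageLuminosity alloc] init];
[videoCamera addTarget:luminosityFilter];

// Optional: also display the live camera feed on screen.
GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:filterView];
[videoCamera addTarget:filterView];

[videoCamera startCameraCapture];
```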

Your luminosity extractor will use a callback block to return luminosity values as they are calculated. This block is set up using code like the following:

    [(GPUImageLuminosity *)filter setLuminosityProcessingFinishedBlock:^(CGFloat luminosity, CMTime frameTime) {
        // Do something with the luminosity
    }];

I describe the inner workings of this luminosity extraction in this answer, if you're curious. This extractor runs in ~6 ms for a 640x480 frame of video on an iPhone 4.

One thing you'll quickly find is that the average luminosity from the iPhone camera is almost always around 50% when automatic exposure is enabled. This means that you'll need to supplement your luminosity measurements with exposure values from the camera metadata to obtain any sort of meaningful brightness measurement.
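As a rough illustration of why the exposure metadata matters: a relative brightness estimate can scale the filter's luminosity by how hard the camera is working to reach that ~50% average. The sketch below assumes the `AVCaptureDevice` `exposureDuration` and `ISO` properties; the compensation formula itself is illustrative only, not part of GPUImage:

```objc
// Longer exposures and higher ISO imply a darker scene, so dividing the
// auto-exposed luminosity by (exposure time * ISO) gives a relative,
// uncalibrated brightness figure. Hypothetical formula for illustration.
AVCaptureDevice *device =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
Float64 exposureSeconds = CMTimeGetSeconds(device.exposureDuration);
CGFloat relativeBrightness = luminosity / (exposureSeconds * device.ISO);
```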
