Getting the area occupied by a certain color onscreen - iOS


Question

I'm trying to do something similar to what is asked in this question, but I don't really understand the answer given to that question and I'm not sure if it is what I need.

What I need is simple, though I'm not so sure it's easy. I want to calculate the number of pixels on the screen that are a certain color. I understand that each 'pixel' we see is actually a combination of pixels of different colors that appear to be, say, green. So what I need is that actual color, the one the user sees.

For example, if I created a UIView, set its background color to [UIColor greenColor], and set its dimensions to half of the area of the screen (we can assume that the status bar is hidden for simplicity and that we are on an iPhone), I would expect this 'magic method' to return 240 * 160, or 38,400 - half the area of the screen.

I don't expect anyone to write out this 'magic method,' but I'd like to know:

a) If it can be done

b) If so, whether it can be done in near-realtime

c) If so again, where to start. I've heard it can be done with OpenGL, but I have no experience in that area.

Here is my solution, thanks to Radif Sharafullin:

#import <UIKit/UIKit.h>

// Counts the pixels of `inImage` whose alpha value is non-zero, by drawing
// the image into an alpha-only bitmap context (one byte per pixel).
int pixelsFromImage(UIImage *inImage) {
    CGSize s = inImage.size;
    const int width = s.width;
    const int height = s.height;

    // One byte of alpha per pixel.
    unsigned char *pixelData = malloc(width * height);

    int pixels = 0;

    // Alpha-only context: no color space is needed and bytesPerRow == width.
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 width,
                                                 height,
                                                 8,
                                                 width,
                                                 NULL,
                                                 (CGBitmapInfo)kCGImageAlphaOnly);

    CGContextClearRect(context, CGRectMake(0, 0, width, height));
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), inImage.CGImage);
    CGContextRelease(context);

    // Count every pixel with a non-zero alpha value.
    for (int idx = 0; idx < width * height; ++idx) {
        if (pixelData[idx]) {
            ++pixels;
        }
    }

    free(pixelData);

    return pixels;
}
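To count what is actually on screen, the view first has to be rendered into a UIImage. Below is a minimal sketch of how the function above might be called; the view name `targetView` is a placeholder, and rendering the layer requires QuartzCore:

// Sketch only: snapshot a view into a UIImage and count its
// non-transparent pixels with pixelsFromImage() above.
// `targetView` is a hypothetical UIView; #import <QuartzCore/QuartzCore.h>
// is needed for -renderInContext:.
UIGraphicsBeginImageContextWithOptions(targetView.bounds.size, NO, 0.0);
[targetView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

int nonTransparentPixels = pixelsFromImage(snapshot);
NSLog(@"Non-transparent pixels: %d", nonTransparentPixels);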

Answer

It is possible. I've done something similar to calculate the percentage of transparent pixels, but since I only needed a rough estimate, I was not looking at each pixel but at every tenth pixel (the step variable in the code below).

// Returns TRUE when the image is (almost) fully transparent. Instead of
// inspecting every pixel, only every `step`-th pixel in each direction is
// sampled, and up to `forgivenessCount` non-transparent samples are found
// before the image is considered not erased.
BOOL isImageErased(UIImage *inImage, float step, int forgivenessCount) {
    CGSize s = inImage.size;
    int width = s.width;
    int height = s.height;
    unsigned char *pixelData = malloc(width * height);
    int forgivenessCounter = 0;

    // Alpha-only context, one byte per pixel.
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 width,
                                                 height,
                                                 8,
                                                 width,
                                                 NULL,
                                                 (CGBitmapInfo)kCGImageAlphaOnly);
    CGContextClearRect(context, CGRectMake(0, 0, width, height));
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), inImage.CGImage);
    CGContextRelease(context);

    // Sample every `step`-th pixel and bail out early once enough
    // non-transparent pixels have been found.
    for (int x = 0; x < width; x = x + step) {
        for (int y = 0; y < height; y = y + step) {
            if (pixelData[y * width + x]) {
                forgivenessCounter++;
                if (forgivenessCounter == forgivenessCount) {
                    free(pixelData);
                    return FALSE;
                }
            }
        }
    }

    free(pixelData);
    return TRUE;
}
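For example, the helper might be called like this (the step of 10 and forgivenessCount of 5 are illustrative values, not part of the original answer):

// Illustrative call: sample every 10th pixel and tolerate up to 5
// non-transparent samples before declaring the image "not erased".
// The asset name is hypothetical.
UIImage *scratchLayer = [UIImage imageNamed:@"scratch-layer"];
BOOL erased = isImageErased(scratchLayer, 10.0f, 5);
NSLog(@"Scratch layer erased: %@", erased ? @"YES" : @"NO");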

I believe this code can be used for your purpose if you pass a preprocessed grayscale image or modify the kCGImageAlphaOnly setting of the API.
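As a sketch of that pre-processing idea: instead of an alpha-only context, the image can be drawn into an RGBA context and each pixel compared against the target color. The function name, tolerance parameter, and matching rule below are assumptions for illustration, not part of the original answer:

// Illustrative sketch: draw the image into an RGBA bitmap and count the
// pixels whose color is within `tolerance` of a target RGB value.
static int pixelsMatchingColor(UIImage *inImage, UInt8 r, UInt8 g, UInt8 b, UInt8 tolerance) {
    const int width  = (int)inImage.size.width;
    const int height = (int)inImage.size.height;
    const int bytesPerPixel = 4;   // RGBA, 8 bits per component
    unsigned char *pixelData = calloc(width * height * bytesPerPixel, 1);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixelData, width, height, 8,
                                                 width * bytesPerPixel, colorSpace,
                                                 (CGBitmapInfo)kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), inImage.CGImage);
    CGContextRelease(context);

    int matches = 0;
    for (int idx = 0; idx < width * height; ++idx) {
        const unsigned char *px = pixelData + idx * bytesPerPixel;   // px[0]=R, px[1]=G, px[2]=B, px[3]=A
        if (abs(px[0] - r) <= tolerance &&
            abs(px[1] - g) <= tolerance &&
            abs(px[2] - b) <= tolerance) {
            ++matches;
        }
    }
    free(pixelData);
    return matches;
}

With a snapshot of the view hierarchy and [UIColor greenColor] as the target, the call would look something like pixelsMatchingColor(snapshot, 0, 255, 0, 8), where the tolerance of 8 is again only an illustrative value.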

Hope it helps.
