Getting the area occupied by a certain color onscreen - iOS
Problem Description
I'm trying to do something similar to what is asked in this question, but I don't really understand the answer given to that question and I'm not sure if it is what I need.
What I need is simple, though I'm not so sure it's easy. I want to calculate the number of pixels on the screen that are a certain color. I understand that each 'pixel' that we see is actually a combination of pixels of different colors that appear to be, say, green. So what I need is the actual color - the one that the user sees.
For example, if I created a UIView, set the background color to [UIColor greenColor], and set its dimensions to half of the area of the screen (we can assume that the status bar is hidden for simplicity and that we are on an iPhone), I would expect this 'magic method' to return 240 * 160, or 38,400 - half the area of the screen.
I don't expect anyone to write out this 'magic method,' but I'd like to know
a) If it's possible
b) If so, whether it can be done in near-realtime
c) If so again, where to start. I've heard it can be done with OpenGL, but I have no experience in that area.
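Setting aside the screen-capture step, the core of such a 'magic method' is just scanning a raw pixel buffer and counting entries that match the target color. A minimal sketch in plain C, assuming a tightly packed 32-bit RGBA buffer (the function name and buffer layout are illustrative, not part of the original question):

```c
#include <stddef.h>
#include <stdint.h>

/* Count pixels in a packed RGBA8888 buffer that exactly match the target
 * color. Rendered pixels rarely match exactly (anti-aliasing, blending),
 * so a production version would likely compare within a tolerance. */
static size_t countColorPixels(const uint8_t *rgba, size_t pixelCount,
                               uint8_t r, uint8_t g, uint8_t b) {
    size_t matches = 0;
    for (size_t i = 0; i < pixelCount; ++i) {
        const uint8_t *p = rgba + i * 4; /* 4 bytes per pixel: R,G,B,A */
        if (p[0] == r && p[1] == g && p[2] == b) {
            ++matches;
        }
    }
    return matches;
}
```

On iOS the buffer would come from rendering the view hierarchy into a bitmap context; the counting loop itself is plain C either way.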
Here is my solution, thanks to Radif Sharafullin:
int pixelsFromImage(UIImage *inImage) {
    CGSize s = inImage.size;
    const int width = s.width;
    const int height = s.height;
    unsigned char *pixelData = malloc(width * height);
    int pixels = 0;
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 width,
                                                 height,
                                                 8,     // bits per component
                                                 width, // bytes per row (1 byte per pixel)
                                                 NULL,  // no color space needed for alpha-only
                                                 kCGImageAlphaOnly);
    CGContextClearRect(context, CGRectMake(0, 0, width, height));
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), inImage.CGImage);
    CGContextRelease(context);
    // Count every pixel with a non-zero alpha value.
    for (int idx = 0; idx < width * height; ++idx) {
        if (pixelData[idx]) {
            ++pixels;
        }
    }
    free(pixelData);
    return pixels;
}
Recommended Answer
It is possible. I've done something similar to calculate the percentage of transparent pixels, but since I only needed a rough estimate, I was not looking at every pixel but at every tenth one - the step variable in the code below.
BOOL isImageErased(UIImage *inImage, float step, int forgivenessCount) {
    CGSize s = inImage.size;
    int width = s.width;
    int height = s.height;
    unsigned char *pixelData = malloc(width * height);
    int forgivenessCounter = 0;
    // Clamp the sampling step to at least 1 so the loops always advance.
    int stride = (step < 1) ? 1 : (int)step;
    CGContextRef context = CGBitmapContextCreate(pixelData,
                                                 width,
                                                 height,
                                                 8,     // bits per component
                                                 width, // bytes per row (1 byte per pixel)
                                                 NULL,  // no color space needed for alpha-only
                                                 kCGImageAlphaOnly);
    CGContextClearRect(context, CGRectMake(0, 0, width, height));
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), inImage.CGImage);
    CGContextRelease(context);
    // Sample every stride-th pixel; bail out early once enough
    // non-transparent pixels have been seen.
    for (int x = 0; x < width; x += stride) {
        for (int y = 0; y < height; y += stride) {
            if (pixelData[y * width + x]) {
                forgivenessCounter++;
                if (forgivenessCounter == forgivenessCount) {
                    free(pixelData);
                    return FALSE;
                }
            }
        }
    }
    free(pixelData);
    return TRUE;
}
I believe this code can be used for your purpose if you pass a preprocessed, grayscaled image or modify the kCGImageAlphaOnly setting of the API.
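The sampling-with-early-exit idea can also be looked at separately from Core Graphics. A sketch in plain C (names are illustrative), scanning every stride-th pixel of an 8-bit alpha buffer and stopping once enough opaque pixels have been seen:

```c
#include <stdbool.h>
#include <stdint.h>

/* Returns true ("erased") if fewer than forgivenessCount sampled pixels
 * are non-zero. Sampling every stride-th pixel in each direction trades
 * accuracy for speed, mirroring the step parameter in the answer above. */
static bool bufferIsErased(const uint8_t *alpha, int width, int height,
                           int stride, int forgivenessCount) {
    if (stride < 1) stride = 1; /* guard against a non-advancing loop */
    int seen = 0;
    for (int x = 0; x < width; x += stride) {
        for (int y = 0; y < height; y += stride) {
            if (alpha[y * width + x]) {
                if (++seen == forgivenessCount) {
                    return false;
                }
            }
        }
    }
    return true;
}
```

Note that with stride > 1 the function can miss small marks that fall between sampled pixels, which is exactly the accuracy/speed trade-off the answer describes.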
Hope this helps.