Detect black pixel in image iOS


Question

As of now I am searching every pixel one by one, checking the color and seeing if it's black... if it isn't, I move on to the next pixel. This is taking forever, as I can only check approx. 100 pixels per second (speeding up my NSTimer freezes the app because it can't check fast enough). So is there any way I can just have Xcode return all the pixels that are black and ignore everything else, so I only have to check those pixels and not every pixel? I am trying to detect the black pixel furthest to the left in my image.

Here is my current code.

- (void)viewDidLoad {
    timer = [NSTimer scheduledTimerWithTimeInterval: 0.01
                                             target: self
                                           selector:@selector(onTick:)
                                           userInfo: nil repeats:YES];
    y1 = 0;
    x1 = 0;
    initialImage = 0;
    height1 = 0;
    width1 = 0;
}

-(void)onTick:(NSTimer *)timer {
    if (initialImage != 1) {
        /*
        IMAGE INITIALLY GETS SET HERE... "image2.image = [blah blah blah];" took this out for non disclosure reasons
        */
        initialImage = 1;
    }
    //image2 is the image I'm checking the pixels of.
    width1 = (int)image2.size.width;
    height1 = (int)image2.size.height;
    CFDataRef imageData = CGDataProviderCopyData(CGImageGetDataProvider(image2.CGImage));
    const UInt32 *pixels = (const UInt32*)CFDataGetBytePtr(imageData);
    if ( (pixels[(x1+(y1*width1))]) == 0x000000) { //0x000000 is black right?
        NSLog(@"black!");
        NSLog(@"x = %i", x1);
        NSLog(@"y = %i", y1);
    }else {
        NSLog(@"val: %lu", (pixels[(x1+(y1*width1))]));
        NSLog(@"x = %i", x1);
        NSLog(@"y = %i", y1);
        x1 ++;
        if (x1 >= width1) {
            y1 ++;
            x1 = 0;
        }
    }
    if (y1 >= height1) { // >= so we never read a row past the end of the pixel buffer
        /*
        MY UPDATE IMAGE CODE GOES HERE (IMAGE CHANGES EVERY TIME ALL PIXELS HAVE BEEN CHECKED)
        */
        y1 = 0;
        x1 = 0;
    }
    CFRelease(imageData); // release the copied pixel data so it doesn't leak on every tick
}

Also, what if a pixel is really close to black but not perfectly black... Can I add a margin of error in there somewhere so it will still detect pixels that are, say, 95% black? Thanks!

Solution

Why are you using a timer at all? Why not just have a double for loop in your function that loops over all possible x- and y-coordinates in the image? Surely that would be waaaay faster than only checking at most 100 pixels per second. You would want to have the x (width) coordinates in the outer loop and the y (height) coordinates in the inner loop so that you are effectively scanning one column of pixels at a time from left to right, since you are trying to find the leftmost black pixel.

Also, are you sure that each pixel in your image has a 4-byte (UInt32) representation? A standard bitmap would have 3 bytes per pixel. To check whether a pixel is close to black, you would just examine each byte in the pixel separately and make sure they are all less than some threshold.
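
One way to check the actual layout, rather than guessing, is to ask Core Graphics directly (a minimal sketch, using the image2 from the question):

CGImageRef cgImage = image2.CGImage;
size_t bitsPerPixel = CGImageGetBitsPerPixel(cgImage);     // 32 means 4 bytes per pixel, 24 means 3
size_t bytesPerRow = CGImageGetBytesPerRow(cgImage);       // rows can be padded past width * bytesPerPixel
CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(cgImage); // where (or whether) the alpha byte sits
NSLog(@"bitsPerPixel: %zu, bytesPerRow: %zu, alphaInfo: %u", bitsPerPixel, bytesPerRow, (unsigned)alphaInfo);

If bytesPerRow comes back larger than width * bytesPerPixel, index with y * bytesPerRow + x * bytesPerPixel instead of (x + y * width) * bytesPerPixel, or the scan will drift off the row boundaries.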

EDIT: OK, since you are using UIGetScreenImage, I'm going to assume that it is 4 bytes per pixel.

const UInt8 *pixels = CFDataGetBytePtr(imageData);
UInt8 blackThreshold = 10; // or some value close to 0
int bytesPerPixel = 4;
for(int x = 0; x < width1; x++) {
  for(int y = 0; y < height1; y++) {
    int pixelStartIndex = (x + (y * width1)) * bytesPerPixel;
    UInt8 alphaVal = pixels[pixelStartIndex]; // can probably ignore this value
    UInt8 redVal = pixels[pixelStartIndex + 1];
    UInt8 greenVal = pixels[pixelStartIndex + 2];
    UInt8 blueVal = pixels[pixelStartIndex + 3];
    if(redVal < blackThreshold && blueVal < blackThreshold && greenVal < blackThreshold) {
      //This pixel is close to black...do something with it
    }
  }
}
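
Since you only want the leftmost black pixel, you can also stop the scan at the first match; with x in the outer loop, the first hit is guaranteed to be the leftmost one. A sketch, reusing pixels, blackThreshold, bytesPerPixel, width1, and height1 from above:

int foundX = -1, foundY = -1;
for(int x = 0; x < width1 && foundX < 0; x++) {
  for(int y = 0; y < height1; y++) {
    int pixelStartIndex = (x + (y * width1)) * bytesPerPixel;
    if(pixels[pixelStartIndex + 1] < blackThreshold &&
       pixels[pixelStartIndex + 2] < blackThreshold &&
       pixels[pixelStartIndex + 3] < blackThreshold) {
      foundX = x; // first match in column order is the leftmost black pixel
      foundY = y;
      break;      // exit the inner loop; the outer loop condition ends the scan
    }
  }
}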

If it turns out that bytesPerPixel is 3, then change that value accordingly, remove the alphaVal from the for loop, and subtract 1 from the indices of the red, green, and blue values.
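
For example, with a 3-byte (RGB, no alpha) layout, the body of the loops might look like this (a sketch under that assumption):

// with bytesPerPixel set to 3:
int pixelStartIndex = (x + (y * width1)) * bytesPerPixel;
UInt8 redVal = pixels[pixelStartIndex];        // no alpha byte, so red comes first
UInt8 greenVal = pixels[pixelStartIndex + 1];
UInt8 blueVal = pixels[pixelStartIndex + 2];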

Also, my current understanding is that UIGetScreenImage is considered a private function, and Apple may or may not reject your app for using it.
