iPhone Image Processing -- Matrix Convolution


Problem Description

I am implementing a matrix convolution blur on the iPhone. The following code converts the UIImage supplied as an argument of the blur function into a CGImageRef, and then stores the RGBA values in a standard C char array.

    CGImageRef imageRef = imgRef.CGImage;
    int width = imgRef.size.width;
    int height = imgRef.size.height;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *pixels = malloc(height * width * 4);
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(pixels, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

Then the pixel values stored in the pixels array are convolved and stored in another array:

unsigned char *results = malloc((height) * (width) * 4);

Finally, these processed pixel values are turned back into a CGImageRef, converted to a UIImage, and returned at the end of the function with the following code.

    context = CGBitmapContextCreate(results, width, height,
                                    bitsPerComponent, bytesPerRow, colorSpace,
                                    kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

    CGImageRef finalImage = CGBitmapContextCreateImage(context);

    UIImage *newImage = [UIImage imageWithCGImage:CGBitmapContextCreateImage(context)];
    CGImageRelease(finalImage);

    NSLog(@"edges found");
    free(results);
    free(pixels);
    CGColorSpaceRelease(colorSpace);

    return newImage;

This works perfectly, once. But when the image is put through the filter a second time, the function returns very odd pixel values that don't correspond to any input pixel. Is there any reason why this should work the first time but not afterward? The entire function is below.

    - (UIImage *)blur:(UIImage *)imgRef {
        CGImageRef imageRef = imgRef.CGImage;
        int width = imgRef.size.width;
        int height = imgRef.size.height;
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        unsigned char *pixels = malloc(height * width * 4);
        NSUInteger bytesPerPixel = 4;
        NSUInteger bytesPerRow = bytesPerPixel * width;
        NSUInteger bitsPerComponent = 8;
        CGContextRef context = CGBitmapContextCreate(pixels, width, height,
                                                     bitsPerComponent, bytesPerRow, colorSpace,
                                                     kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
        CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
        CGContextRelease(context);

        height = imgRef.size.height;
        width = imgRef.size.width;
        float matrix[] = {1,1,1,1,1,1,1,1,1};
        float divisor = 9;
        float shift = 0;

        unsigned char *results = malloc(height * width * 4);

        for (int y = 1; y < height; y++) {
            for (int x = 1; x < width; x++) {
                float red = 0;
                float green = 0;
                float blue = 0;
                int multiplier = 1;

                if (y > 0 && x > 0) {
                    int index = (y-1)*width + x;
                    red = matrix[0]*multiplier*(float)pixels[4*(index-1)] +
                          matrix[1]*multiplier*(float)pixels[4*(index)] +
                          matrix[2]*multiplier*(float)pixels[4*(index+1)];
                    green = matrix[0]*multiplier*(float)pixels[4*(index-1)+1] +
                            matrix[1]*multiplier*(float)pixels[4*(index)+1] +
                            matrix[2]*multiplier*(float)pixels[4*(index+1)+1];
                    blue = matrix[0]*multiplier*(float)pixels[4*(index-1)+2] +
                           matrix[1]*multiplier*(float)pixels[4*(index)+2] +
                           matrix[2]*multiplier*(float)pixels[4*(index+1)+2];

                    index = y*width + x;
                    red = red + matrix[3]*multiplier*(float)pixels[4*(index-1)] +
                          matrix[4]*multiplier*(float)pixels[4*(index)] +
                          matrix[5]*multiplier*(float)pixels[4*(index+1)];
                    green = green + matrix[3]*multiplier*(float)pixels[4*(index-1)+1] +
                            matrix[4]*multiplier*(float)pixels[4*(index)+1] +
                            matrix[5]*multiplier*(float)pixels[4*(index+1)+1];
                    blue = blue + matrix[3]*multiplier*(float)pixels[4*(index-1)+2] +
                           matrix[4]*multiplier*(float)pixels[4*(index)+2] +
                           matrix[5]*multiplier*(float)pixels[4*(index+1)+2];

                    index = (y+1)*width + x;
                    red = red + matrix[6]*multiplier*(float)pixels[4*(index-1)] +
                          matrix[7]*multiplier*(float)pixels[4*(index)] +
                          matrix[8]*multiplier*(float)pixels[4*(index+1)];
                    green = green + matrix[6]*multiplier*(float)pixels[4*(index-1)+1] +
                            matrix[7]*multiplier*(float)pixels[4*(index)+1] +
                            matrix[8]*multiplier*(float)pixels[4*(index+1)+1];
                    blue = blue + matrix[6]*multiplier*(float)pixels[4*(index-1)+2] +
                           matrix[7]*multiplier*(float)pixels[4*(index)+2] +
                           matrix[8]*multiplier*(float)pixels[4*(index+1)+2];

                    red = red/divisor + shift;
                    green = green/divisor + shift;
                    blue = blue/divisor + shift;

                    if (red < 0)   red = 0;
                    if (green < 0) green = 0;
                    if (blue < 0)  blue = 0;

                    if (red > 255)   red = 255;
                    if (green > 255) green = 255;
                    if (blue > 255)  blue = 255;

                    int realPos = 4*(y*imgRef.size.width + x);
                    results[realPos]     = red;
                    results[realPos + 1] = green;
                    results[realPos + 2] = blue;
                    results[realPos + 3] = 1;
                } else {
                    int realPos = 4*(y*imgRef.size.width + x);
                    results[realPos]     = 0;
                    results[realPos + 1] = 0;
                    results[realPos + 2] = 0;
                    results[realPos + 3] = 1;
                }
            }
        }

        context = CGBitmapContextCreate(results, width, height,
                                        bitsPerComponent, bytesPerRow, colorSpace,
                                        kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

        CGImageRef finalImage = CGBitmapContextCreateImage(context);

        UIImage *newImage = [UIImage imageWithCGImage:CGBitmapContextCreateImage(context)];
        CGImageRelease(finalImage);

        free(results);
        free(pixels);
        CGColorSpaceRelease(colorSpace);

        return newImage;
    }

Thanks!

Answer

The problem was that I was assuming the alpha value; it needed to be calculated like the RGB values.
