How can I get RGB value of pixel at specified point on the camera screen?

Question

I want to get the color value, in terms of RGB, of a specified point on the camera screen (without capturing a still image).

I have the following code snippet, but it gives the value of the view's background color, not the picture on the camera screen.

CGPoint point = CGPointMake(100, 200);
unsigned char pixel[4] = {0};

// 1x1 RGBA bitmap context backed by the pixel[] array above.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, kCGImageAlphaPremultipliedLast);

// Shift the context so the target point lands at (0, 0), then render
// the view's layer into it. renderInContext: draws the layer tree, not
// the live camera feed, which is why this reads the background color.
CGContextTranslateCTM(context, -point.x, -point.y);
[self.view.layer renderInContext:context];

CGContextRelease(context);
CGColorSpaceRelease(colorSpace);

NSLog(@"pixel: R=%d G=%d B=%d Alpha=%d", pixel[0], pixel[1], pixel[2], pixel[3]);

Answer

Assuming you want to do this in real time (rather than using a screen capture, which you specifically say you don't want):

You first need to be capturing the video buffer, as outlined by Apple here.
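Here is a minimal sketch of that capture setup, assuming an AVCaptureSession configured for 32-bit BGRA output feeding a sample-buffer delegate; the method name setupCaptureSession and the queue label are illustrative, not Apple's exact sample code:

#import <AVFoundation/AVFoundation.h>

// Sketch: configure a capture session whose video data output delivers
// BGRA frames to this object (which must conform to
// AVCaptureVideoDataOutputSampleBufferDelegate) on a background queue.
- (void)setupCaptureSession
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Default camera as input.
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [session addInput:input];

    // BGRA output keeps the per-pixel math below straightforward.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{
        (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
    };
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("videoQueue", NULL)];
    [session addOutput:output];

    [session startRunning];
}

// Each frame lands here as a CMSampleBufferRef; the pixel extraction
// shown below goes inside this callback.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // ... see the snippet below ...
}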

Then you can do what you like with the CMSampleBufferRef. Apple's sample app makes a UIImage, but you can simply copy it across into an unsigned char pointer (via a CVImageBufferRef or CVPixelBufferRef) and then pull the BGRA value of the pixel in question, something like this (untested code; the example is for the pixel at x=100, y=200):

int x = 100;
int y = 200;

// Get the pixel buffer for this frame and lock it for CPU access.
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
uint8_t *tempAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);

// Copy the frame out while the buffer is still locked, then unlock it.
size_t bufferSize = bytesPerRow * height;
uint8_t *myPixelBuf = malloc(bufferSize);
memmove(myPixelBuf, tempAddress, bufferSize);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
tempAddress = NULL;

// Remember it's BGRA data: 4 bytes per pixel, rows bytesPerRow apart.
size_t offset = (x * 4) + (y * bytesPerRow);
int b = myPixelBuf[offset];
int g = myPixelBuf[offset + 1];
int r = myPixelBuf[offset + 2];
free(myPixelBuf);
NSLog(@"r:%i g:%i b:%i", r, g, b);

This gets the position relative to the pixels of the video feed itself, which may not be what you want: if you want the position of the pixel as displayed on the iPhone's display, you may need to scale this.
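As a rough illustration of that scaling, assuming the preview simply stretches the feed to fill the view (no aspect-fit letterboxing or aspect-fill cropping, and no rotation between the buffer and the screen), a point in view coordinates could be mapped to buffer pixels like this:

// Sketch (assumption: preview fills the view with no aspect-ratio
// correction). Maps a point in view coordinates to buffer pixel
// coordinates by scaling each axis independently.
CGPoint viewPoint = CGPointMake(100, 200);
CGSize viewSize = self.view.bounds.size;
size_t bufWidth  = CVPixelBufferGetWidth(imageBuffer);
size_t bufHeight = CVPixelBufferGetHeight(imageBuffer);

int x = (int)(viewPoint.x * bufWidth  / viewSize.width);
int y = (int)(viewPoint.y * bufHeight / viewSize.height);

In practice camera frames often arrive in the sensor's landscape orientation and previews usually use aspect-fill, so this simple stretch rarely holds exactly. If you display the feed through an AVCaptureVideoPreviewLayer, its captureDevicePointOfInterestForPoint: method (iOS 6+) converts a layer point to a normalized point in the capture device's coordinate space, taking the layer's videoGravity into account.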
