How to get the Y component from CMSampleBuffer resulting from the AVCaptureSession?

Question

Hey there, I am trying to access raw data from the iPhone camera using AVCaptureSession. I'm following the guide provided by Apple (link here).

The raw data from the sample buffer is in YUV format (am I correct here about the raw video frame format?). How can I directly obtain the data for the Y component out of the raw data stored in the sample buffer?

Answer

When setting up the AVCaptureVideoDataOutput that returns the raw camera frames, you can set the format of the frames using code like the following:

[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

In this case a BGRA pixel format is specified (I used this to match a color format for an OpenGL ES texture). Each pixel in that format has one byte for blue, green, red, and alpha, in that order. Going with this makes it easy to pull out color components, but you do sacrifice a little performance by needing to make the conversion from the camera-native YUV colorspace.
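
As a quick illustration of that byte order, here is a sketch only, meant to sit inside the delegate callback shown further down, where pixelBuffer has already been locked; x and y are hypothetical values standing in for whatever pixel coordinate you want to read:

size_t x = 0, y = 0; // hypothetical coordinate of the pixel you want to inspect
unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

// Each BGRA pixel occupies 4 bytes; rows can be padded, so step by bytesPerRow
unsigned char *pixel = rawPixelBase + (y * bytesPerRow) + (x * 4);
unsigned char blue  = pixel[0];
unsigned char green = pixel[1];
unsigned char red   = pixel[2];
unsigned char alpha = pixel[3];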

Other supported colorspaces are kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange and kCVPixelFormatType_420YpCbCr8BiPlanarFullRange on newer devices, and kCVPixelFormatType_422YpCbCr8 on the iPhone 3G. The VideoRange or FullRange suffix simply indicates whether the bytes are returned between 16 - 235 for Y and 16 - 240 for UV, or the full 0 - 255 range for each component.
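
If you would rather receive YUV directly and skip the BGRA conversion entirely, the same setVideoSettings call can request one of those formats instead. A minimal sketch, assuming a device that supports the biplanar format:

// Ask for full-range biplanar YUV so the Y plane arrives without any conversion
[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];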

I believe the default colorspace used by an AVCaptureVideoDataOutput instance is the YUV 4:2:0 planar colorspace (except on the iPhone 3G, where it's YUV 4:2:2 interleaved). This means that there are two planes of image data contained within the video frame, with the Y plane coming first. For every pixel in your resulting image, there is one byte for the Y value at that pixel.

You would get at this raw Y data by implementing something like this in your delegate callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

    // Do something with the raw pixels here

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

You could then figure out the location in the frame data for each X, Y coordinate on the image and pull out the byte that corresponds to the Y component at that coordinate.
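
As a rough sketch of that indexing, assuming the buffer arrives in one of the biplanar 420 formats (where plane 0 is the luma plane), you could use the plane-level CVPixelBuffer accessors inside the same delegate callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Plane 0 holds the luma (Y) samples in the biplanar 420 formats
    unsigned char *yPlane = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    size_t width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);

    // One byte of Y per pixel; rows may be padded, so index with bytesPerRow
    for (size_t row = 0; row < height; row++)
    {
        for (size_t col = 0; col < width; col++)
        {
            unsigned char luma = yPlane[row * bytesPerRow + col];
            // Do something with the luma value here
        }
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

This only applies to the biplanar formats; on the iPhone 3G's interleaved 4:2:2 format the Y bytes are interleaved with the chroma bytes, so the layout is different.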

Apple's FindMyiCone sample from WWDC 2010 (accessible along with the session videos) shows how to process raw BGRA data from each frame. I also created a sample application, which you can download the code for here, that performs color-based object tracking using the live video from the iPhone's camera. Both show how to process raw pixel data, but neither of these works in the YUV colorspace.
