Create CVPixelBuffer from YUV with IOSurface backed

Question
So I am getting raw YUV data in 3 separate arrays from a network callback (VoIP app). From what I understand, you cannot create IOSurface-backed pixel buffers with CVPixelBufferCreateWithPlanarBytes(), according to the documentation here:
Important: You cannot use CVPixelBufferCreateWithBytes() or CVPixelBufferCreateWithPlanarBytes() with kCVPixelBufferIOSurfacePropertiesKey. Calling CVPixelBufferCreateWithBytes() or CVPixelBufferCreateWithPlanarBytes() will result in CVPixelBuffers that are not IOSurface-backed.
Thus you have to create it with CVPixelBufferCreate(), but how do you transfer the data from the callback to the CVPixelBufferRef that you create?
void videoCallBack(uint8_t *yPlane, uint8_t *uPlane, uint8_t *vPlane,
                   size_t width, size_t height, size_t yStride,
                   size_t uStride, size_t vStride)
{
    NSDictionary *pixelAttributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                          width,
                                          height,
                                          kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                          (__bridge CFDictionaryRef)(pixelAttributes),
                                          &pixelBuffer);
}
I am unsure what to do after this point. Eventually I want to turn this into a CIImage, which I can then render in my GLKView. How do people "put" the data into the buffer after creating it?
Answer
I figured it out and it was fairly trivial. Here is the full code below. The only issue is that I get a "BSXPCMessage received error for message: Connection interrupted" log, and it takes a while for the video to show.
NSDictionary *pixelAttributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
CVPixelBufferRef pixelBuffer = NULL;
CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                      width,
                                      height,
                                      kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                      (__bridge CFDictionaryRef)(pixelAttributes),
                                      &pixelBuffer);
if (result != kCVReturnSuccess) {
    DDLogWarn(@"Unable to create cvpixelbuffer %d", result);
    return;
}

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Copy the luma plane. This assumes the source is tightly packed
// (yStride == width); otherwise copy row by row.
uint8_t *yDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
memcpy(yDestPlane, yPlane, width * height);

// uvPlane must already hold interleaved Cb/Cr samples (NV12 layout);
// numberOfElementsForChroma is width * height / 2 for 4:2:0 data.
uint8_t *uvDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
memcpy(uvDestPlane, uvPlane, numberOfElementsForChroma);

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

CIImage *coreImage = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // success!
CVPixelBufferRelease(pixelBuffer);
I forgot to add the code to interleave the two U and V planes, but that shouldn't be too bad.