Create CVPixelBuffer from YUV with IOSurface backed
Question
So I am getting raw YUV data in 3 separate arrays from a network callback (VoIP app). From what I understand, you cannot create IOSurface-backed pixel buffers with CVPixelBufferCreateWithPlanarBytes, according to here:
Important: You cannot use CVPixelBufferCreateWithBytes() or CVPixelBufferCreateWithPlanarBytes() with kCVPixelBufferIOSurfacePropertiesKey. Calling CVPixelBufferCreateWithBytes() or CVPixelBufferCreateWithPlanarBytes() will result in CVPixelBuffers that are not IOSurface-backed
So you have to create it with CVPixelBufferCreate, but how do you transfer the data from the callback to the CVPixelBufferRef that you create?
- (void)videoCallBack(uint8_t *yPlane, uint8_t *uPlane, uint8_t *vPlane, size_t width, size_t height,
                      size_t yStride, size_t uStride, size_t vStride)
{
    NSDictionary *pixelAttributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                          width,
                                          height,
                                          kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                          (__bridge CFDictionaryRef)(pixelAttributes),
                                          &pixelBuffer);
I am unsure what to do next. Eventually I want to turn this into a CIImage, which I can then render with my GLKView. How do people "put" the data into the buffer after creating it?
Answer
I figured it out and it was fairly trivial. Here is the full code below. The only issue is that I get a BSXPCMessage received error for message: Connection interrupted, and it takes a while for the video to show.
NSDictionary *pixelAttributes = @{(id)kCVPixelBufferIOSurfacePropertiesKey : @{}};
CVPixelBufferRef pixelBuffer = NULL;
CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                      width,
                                      height,
                                      kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                      (__bridge CFDictionaryRef)(pixelAttributes),
                                      &pixelBuffer);
if (result != kCVReturnSuccess) {
    DDLogWarn(@"Unable to create cvpixelbuffer %d", result);
}

CVPixelBufferLockBaseAddress(pixelBuffer, 0);
uint8_t *yDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
// Assumes the destination luma plane is tightly packed (bytes-per-row == width);
// otherwise copy row by row (see the sketch after this block).
memcpy(yDestPlane, yPlane, width * height);
uint8_t *uvDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
// uvPlane is the interleaved CbCr data; numberOfElementsForChroma is its size
// (width * height / 2 for 4:2:0).
memcpy(uvDestPlane, uvPlane, numberOfElementsForChroma);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

CIImage *coreImage = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // success!
CVPixelBufferRelease(pixelBuffer);
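One caveat (not from the original answer): CVPixelBufferCreate may allocate planes whose bytes-per-row is larger than the frame width, and a single memcpy of width * height then misaligns every row. A hedged sketch of a stride-aware copy for the luma plane, meant to run between the lock/unlock calls above and using only the variables from the callback signature:

// Copy the Y plane row by row, respecting both the source stride (yStride)
// and the destination stride that Core Video actually allocated.
size_t yDestStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
for (size_t row = 0; row < height; row++) {
    memcpy(yDestPlane + row * yDestStride,
           yPlane + row * yStride,
           width);
}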
I forgot to add the code to interleave the two U and V planes, but that shouldn't be too bad.
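For completeness, here is a minimal sketch of that interleaving step. It assumes 4:2:0 chroma (each chroma plane is width/2 × height/2), uses the uStride/vStride arguments from the callback signature, and must run while the pixel buffer's base address is locked; it would replace the single memcpy of uvPlane above. This is my addition, not the original poster's code.

// Interleave the separate U (Cb) and V (Cr) planes into the single CbCr plane
// expected by kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange.
uint8_t *uvDestPlane = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
size_t uvDestStride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);
size_t chromaWidth  = width / 2;
size_t chromaHeight = height / 2;

for (size_t row = 0; row < chromaHeight; row++) {
    const uint8_t *uSrc = uPlane + row * uStride;
    const uint8_t *vSrc = vPlane + row * vStride;
    uint8_t *uvDest = uvDestPlane + row * uvDestStride;
    for (size_t col = 0; col < chromaWidth; col++) {
        uvDest[2 * col]     = uSrc[col];   // Cb
        uvDest[2 * col + 1] = vSrc[col];   // Cr
    }
}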
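Finally, to draw the resulting CIImage through a GLKView as the question intends, one common approach is to back a CIContext with the view's EAGLContext and draw each frame into it. This is a sketch under the assumption that self.glkView and self.ciContext are properties you set up yourself; none of it is from the original answer.

// One-time setup, e.g. in viewDidLoad.
EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
self.glkView.context = eaglContext;
self.glkView.enableSetNeedsDisplay = NO;   // frames are pushed manually
self.ciContext = [CIContext contextWithEAGLContext:eaglContext];

// Per decoded frame, on the main thread.
[EAGLContext setCurrentContext:self.glkView.context];
[self.glkView bindDrawable];
CGRect destRect = CGRectMake(0, 0, self.glkView.drawableWidth, self.glkView.drawableHeight);
[self.ciContext drawImage:coreImage inRect:destRect fromRect:coreImage.extent];
[self.glkView display];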