Where is the official documentation for CVOpenGLESTexture method types?


Question

I tried Google and Stack Overflow, but I can't seem to find the official documentation for functions that start with CVOpenGLESTexture. I can see they are from Core Video, and I know they were added in iOS 5, but searching the documentation doesn't give me anything.

I am looking for information about the parameters, what they do, how to use them, and so on, like in the other Apple frameworks.

So far all I can do is Command-click on a function to see the header information, but this feels super awkward. Or is there a way to add this so it can be displayed in the Quick Help panel on the right in Xcode?

Thanks, and sorry if this is a stupid question.

P.S.: The Core Video reference guide doesn't seem to explain these either.

Answer

Unfortunately, there really isn't any documentation on these new functions. The best you're going to find right now is in the CVOpenGLESTextureCache.h header file, where you'll see a basic description of the function parameters:

/*!
    @function   CVOpenGLESTextureCacheCreate
    @abstract   Creates a new Texture Cache.
    @param      allocator The CFAllocatorRef to use for allocating the cache.  May be NULL.
    @param      cacheAttributes A CFDictionaryRef containing the attributes of the cache itself.   May be NULL.
    @param      eaglContext The OpenGLES 2.0 context into which the texture objects will be created.  OpenGLES 1.x contexts are not supported.
    @param      textureAttributes A CFDictionaryRef containing the attributes to be used for creating the CVOpenGLESTexture objects.  May be NULL.
    @param      cacheOut   The newly created texture cache will be placed here
    @result     Returns kCVReturnSuccess on success
*/
CV_EXPORT CVReturn CVOpenGLESTextureCacheCreate(
                    CFAllocatorRef allocator,
                    CFDictionaryRef cacheAttributes,
                    void *eaglContext,
                    CFDictionaryRef textureAttributes,
                    CVOpenGLESTextureCacheRef *cacheOut) __OSX_AVAILABLE_STARTING(__MAC_NA,__IPHONE_5_0);

The more difficult elements are the attributes dictionaries, which unfortunately you need to find examples of in order to use these functions properly. Apple has the GLCameraRipple and RosyWriter examples that show how to use the fast texture upload path with BGRA and YUV input color formats. Apple also provided the ChromaKey example at WWDC (which may still be accessible along with the session videos) that demonstrated how to use these texture caches to pull information from an OpenGL ES texture.
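As a hedged illustration of what those attributes dictionaries can look like: when you create your own CVPixelBuffer for the cache to wrap (rather than receiving one from the camera), the trick used in Apple's samples is to make the buffer IOSurface-backed by passing an empty dictionary under `kCVPixelBufferIOSurfacePropertiesKey`. The exact attribute set below is an assumption based on those samples, not an officially documented recipe:

```objc
// Sketch (assumptions noted above): create an IOSurface-backed pixel buffer
// that CVOpenGLESTextureCacheCreateTextureFromImage can wrap without a copy.
CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0,
                                           &kCFTypeDictionaryKeyCallBacks,
                                           &kCFTypeDictionaryValueCallBacks);
CFMutableDictionaryRef attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
                                                         &kCFTypeDictionaryKeyCallBacks,
                                                         &kCFTypeDictionaryValueCallBacks);
// An empty IOSurface properties dictionary requests IOSurface backing.
CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

CVPixelBufferRef renderTarget = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, 640, 480, kCVPixelFormatType_32BGRA,
                    attrs, &renderTarget);

CFRelease(attrs);
CFRelease(empty);
```

Camera frames delivered by AVFoundation are already compatible with the cache, so this only matters when you allocate the pixel buffer yourself.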

I just got this fast texture uploading working in my GPUImage framework (the source code for which is available at that link), so I'll lay out what I was able to parse out of this. First, I create a texture cache using the following code:

CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[GPUImageOpenGLESContext sharedImageProcessingOpenGLESContext] context], NULL, &coreVideoTextureCache);
if (err) 
{
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}

where the context referred to is an EAGLContext configured for OpenGL ES 2.0.
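For completeness, such a context can be created as follows. This is a minimal sketch of standard EAGLContext setup, not part of the texture cache API itself; note that the cache explicitly rejects ES 1.x contexts:

```objc
#import <OpenGLES/EAGL.h>

// Create an OpenGL ES 2.0 context suitable for CVOpenGLESTextureCacheCreate.
EAGLContext *context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
if (![EAGLContext setCurrentContext:context])
{
    NSLog(@"Failed to set the OpenGL ES 2.0 context");
}
```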

I use this to keep video frames from the iOS device camera in video memory, and I use the following code to do that:

CVPixelBufferLockBaseAddress(cameraFrame, 0);

CVOpenGLESTextureRef texture = NULL;
CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache, cameraFrame, NULL, GL_TEXTURE_2D, GL_RGBA, bufferWidth, bufferHeight, GL_BGRA, GL_UNSIGNED_BYTE, 0, &texture);

if (!texture || err) {
    NSLog(@"CVOpenGLESTextureCacheCreateTextureFromImage failed (error: %d)", err);  
    return;
}

outputTexture = CVOpenGLESTextureGetName(texture);
glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// Do processing work on the texture data here

CVPixelBufferUnlockBaseAddress(cameraFrame, 0);

CVOpenGLESTextureCacheFlush(coreVideoTextureCache, 0);
CFRelease(texture);
outputTexture = 0;

This creates a new CVOpenGLESTextureRef, representing an OpenGL ES texture, from the texture cache. This texture is based on the CVImageBufferRef passed in by the camera. That texture is then retrieved from the CVOpenGLESTextureRef and the appropriate parameters are set for it (which seemed to be necessary in my processing). Finally, I do my work on the texture and clean up when I'm done.

This fast upload process makes a real difference on iOS devices. It took the upload and processing of a single 640x480 video frame on an iPhone 4S from 9.0 ms down to 1.8 ms.

I've heard that this works in reverse as well, which might allow for the replacement of glReadPixels() in certain situations, but I've yet to try this.
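The reverse direction would presumably look something like the following. This is a speculative, untested sketch mirroring the forward path: it assumes `renderTarget` is an IOSurface-backed CVPixelBufferRef you created yourself, `coreVideoTextureCache` is the cache created above, and a framebuffer object is already bound. The idea is to render into a texture that shares memory with the pixel buffer, then read the buffer's base address instead of calling glReadPixels():

```objc
// Speculative sketch: render-to-texture backed by a CVPixelBuffer.
CVOpenGLESTextureRef renderTexture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCache,
                                             renderTarget, NULL, GL_TEXTURE_2D, GL_RGBA,
                                             640, 480, GL_BGRA, GL_UNSIGNED_BYTE, 0,
                                             &renderTexture);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture),
              CVOpenGLESTextureGetName(renderTexture));
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(renderTexture), 0);

// ... render the scene into the framebuffer ...

glFinish(); // ensure rendering has completed before reading the memory
CVPixelBufferLockBaseAddress(renderTarget, 0);
GLubyte *pixels = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
// ... read pixel data directly, with no glReadPixels() round trip ...
CVPixelBufferUnlockBaseAddress(renderTarget, 0);
```

Again, treat this as an outline of the technique rather than verified code.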
