OpenGL ES 2d rendering into image


Question

I need to write an OpenGL ES two-dimensional renderer on iOS. It should draw primitives such as lines and polygons into a 2D image (it will be rendering a vector map). What is the best way to get an image out of the OpenGL context for this task? I mean, should I render these primitives into a texture and then get the image from it, or what? Also, it would be great if someone could point to examples or tutorials that resemble what I need (2D GL rendering into an image). Thanks in advance!

Answer

If you need to render an OpenGL ES 2-D scene, then extract an image of that scene to use outside of OpenGL ES, you have two main options.

The first is to simply render your scene and use glReadPixels() to grab RGBA data for the scene and place it in a byte array, like in the following:

size_t totalBytesForImage = (size_t)currentFBOSize.width * (size_t)currentFBOSize.height * 4; // 4 bytes per RGBA pixel
GLubyte *rawImagePixels = (GLubyte *)malloc(totalBytesForImage);
glReadPixels(0, 0, (int)currentFBOSize.width, (int)currentFBOSize.height, GL_RGBA, GL_UNSIGNED_BYTE, rawImagePixels);
// Do something with the image
free(rawImagePixels);
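
One caveat with this path, not covered in the snippet above: `glReadPixels()` returns rows bottom-up, while most image APIs (Core Graphics included) expect top-down row order. A minimal C sketch of the vertical flip you typically need afterward (`flipRGBAImageVertically` is a hypothetical helper, not part of OpenGL ES):

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: flip a tightly packed RGBA buffer vertically,
 * since glReadPixels() returns rows bottom-up while most image APIs
 * expect top-down ordering. */
static void flipRGBAImageVertically(unsigned char *pixels, int width, int height)
{
    size_t rowBytes = (size_t)width * 4; /* 4 bytes per RGBA pixel */
    unsigned char *tempRow = (unsigned char *)malloc(rowBytes);
    for (int row = 0; row < height / 2; row++)
    {
        unsigned char *top = pixels + (size_t)row * rowBytes;
        unsigned char *bottom = pixels + (size_t)(height - 1 - row) * rowBytes;
        memcpy(tempRow, top, rowBytes);    /* stash the top row */
        memcpy(top, bottom, rowBytes);     /* move the bottom row up */
        memcpy(bottom, tempRow, rowBytes); /* move the stashed row down */
    }
    free(tempRow);
}
```

You would call this on `rawImagePixels` right after `glReadPixels()` and before handing the buffer to an image API.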

The second, and much faster, way of doing this is to render your scene to a texture-backed framebuffer object (FBO), where the texture has been provided by iOS 5.0's texture caches. I describe this approach in this answer, although I don't show the code for raw data access there.

You do the following to set up the texture cache and bind the FBO texture:

    CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)[[GPUImageOpenGLESContext sharedImageProcessingOpenGLESContext] context], NULL, &rawDataTextureCache);
    if (err) 
    {
        NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
    }

    // Code originally sourced from http://allmybrain.com/2011/12/08/rendering-to-a-texture-with-ios-5-texture-cache-api/

    CFDictionaryRef empty; // empty value for attr value.
    CFMutableDictionaryRef attrs;
    empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                               NULL,
                               NULL,
                               0,
                               &kCFTypeDictionaryKeyCallBacks,
                               &kCFTypeDictionaryValueCallBacks);
    attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                      1,
                                      &kCFTypeDictionaryKeyCallBacks,
                                      &kCFTypeDictionaryValueCallBacks);

    CFDictionarySetValue(attrs,
                         kCVPixelBufferIOSurfacePropertiesKey,
                         empty);

    //CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);

    CVPixelBufferCreate(kCFAllocatorDefault, 
                        (int)imageSize.width, 
                        (int)imageSize.height,
                        kCVPixelFormatType_32BGRA,
                        attrs,
                        &renderTarget);

    CVOpenGLESTextureRef renderTexture;
    CVOpenGLESTextureCacheCreateTextureFromImage (kCFAllocatorDefault,
                                                  rawDataTextureCache, renderTarget,
                                                  NULL, // texture attributes
                                                  GL_TEXTURE_2D,
                                                  GL_RGBA, // opengl format
                                                  (int)imageSize.width, 
                                                  (int)imageSize.height,
                                                  GL_BGRA, // native iOS format
                                                  GL_UNSIGNED_BYTE,
                                                  0,
                                                  &renderTexture);
    CFRelease(attrs);
    CFRelease(empty);
    glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);

and then you can just read directly from the bytes that back this texture (in BGRA format, not the RGBA of glReadPixels()) using something like:

    CVPixelBufferLockBaseAddress(renderTarget, 0);
    _rawBytesForImage = (GLubyte *)CVPixelBufferGetBaseAddress(renderTarget);
    // Do something with the bytes
    CVPixelBufferUnlockBaseAddress(renderTarget, 0);
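
One detail worth noting when reading those bytes: `CVPixelBufferGetBytesPerRow()` can report a row stride larger than `width * 4` because Core Video may pad rows for alignment, so a single `memcpy` of `width * height * 4` bytes can produce a skewed image. A minimal C sketch of a stride-aware copy (`copyTightlyPackedBGRA` is a hypothetical helper, not a Core Video API):

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: copy a possibly row-padded BGRA buffer into a
 * tightly packed one. bytesPerRow is the stride reported by
 * CVPixelBufferGetBytesPerRow(), which may exceed width * 4. */
static unsigned char *copyTightlyPackedBGRA(const unsigned char *baseAddress,
                                            size_t bytesPerRow,
                                            int width, int height)
{
    size_t tightRowBytes = (size_t)width * 4; /* 4 bytes per BGRA pixel */
    unsigned char *packed = (unsigned char *)malloc(tightRowBytes * (size_t)height);
    for (int row = 0; row < height; row++)
    {
        /* Copy only the pixel data from each row, skipping the padding. */
        memcpy(packed + (size_t)row * tightRowBytes,
               baseAddress + (size_t)row * bytesPerRow,
               tightRowBytes);
    }
    return packed; /* caller frees */
}
```

You would call this between the lock and unlock calls above, passing `CVPixelBufferGetBaseAddress(renderTarget)` and `CVPixelBufferGetBytesPerRow(renderTarget)`.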

However, if you just want to reuse your image within OpenGL ES, you just need to render your scene to a texture-backed FBO and then use that texture in your second level of rendering.

I show an example of rendering to a texture and then performing some processing on it in the CubeExample sample application in my open source GPUImage framework, if you want to see this in action.
