Render CoreGraphics to OpenGL texture on iOS


Question

Using Core Graphics on iOS is very easy, but is it possible to take the output of Core Graphics and put it into an OpenGL texture?

The final goal is to use CGContextDrawPDFPage to render PDFs with high performance and write the output into a specific texture ID with

OpenGL.glBindTexture(GL_TEXTURE_2D, TextureNativeId);

It does look like Core Graphics is not able to render directly into a specific "native texture id".

Answer

Yes, you can, by rendering your Core Graphics content to a bitmap context and uploading that to a texture. The following is code that I use to draw a UIImage to a Core Graphics context, but you could replace the CGContextDrawImage() portion with your own drawing code:

GLubyte *imageData = (GLubyte *) calloc(1, (int)pixelSizeOfImage.width * (int)pixelSizeOfImage.height * 4);
CGColorSpaceRef genericRGBColorspace = CGColorSpaceCreateDeviceRGB();    
CGContextRef imageContext = CGBitmapContextCreate(imageData, (int)pixelSizeOfImage.width, (int)pixelSizeOfImage.height, 8, (int)pixelSizeOfImage.width * 4, genericRGBColorspace,  kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGContextDrawImage(imageContext, CGRectMake(0.0, 0.0, pixelSizeOfImage.width, pixelSizeOfImage.height), [newImageSource CGImage]);
CGContextRelease(imageContext);
CGColorSpaceRelease(genericRGBColorspace);

glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (int)pixelSizeOfImage.width, (int)pixelSizeOfImage.height, 0, GL_BGRA, GL_UNSIGNED_BYTE, imageData);
free(imageData); // glTexImage2D copies the pixels, so the CPU-side buffer can be freed

This assumes that you've created your texture using code like the following:

glActiveTexture(GL_TEXTURE0);
glGenTextures(1, &outputTexture);
glBindTexture(GL_TEXTURE_2D, outputTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// This is necessary for non-power-of-two textures
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glBindTexture(GL_TEXTURE_2D, 0);
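Coming back to the question's goal: the CGContextDrawImage() call in the answer can be swapped for CGContextDrawPDFPage(). A minimal sketch, assuming a CFURLRef pdfURL pointing at the document and the imageContext / pixelSizeOfImage from above; error checking and the case where the page box doesn't match the context size are omitted:

```c
// Sketch: draw page 1 of a PDF into the bitmap context from the answer.
CGPDFDocumentRef document = CGPDFDocumentCreateWithURL(pdfURL);
CGPDFPageRef page = CGPDFDocumentGetPage(document, 1); // pages are 1-based

// Scale the page's media box to fill the context. Both Core Graphics
// bitmap contexts and PDF pages use a bottom-left origin, so no flip
// is needed.
CGRect mediaBox = CGPDFPageGetBoxRect(page, kCGPDFMediaBox);
CGContextScaleCTM(imageContext,
                  pixelSizeOfImage.width  / mediaBox.size.width,
                  pixelSizeOfImage.height / mediaBox.size.height);

CGContextDrawPDFPage(imageContext, page);
CGPDFDocumentRelease(document); // the page is owned by the document
```

After this, the glTexImage2D() upload works exactly as in the UIImage example.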

For rapidly changing content, you might want to look into iOS 5.0's texture caches (CVOpenGLESTextureCacheCreateTextureFromImage() and the like), which might let you render directly to the bytes for your texture. However, I've found that the overhead for creating and rendering to a texture with a texture cache makes this slightly slower for rendering a single image, so if you don't need to continually update this the code above is probably your fastest route.
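For reference, the texture-cache path looks roughly like the following. This is a sketch, not the answer author's code: it assumes an existing EAGLContext *glContext and size_t width/height, and omits checking the CVReturn codes. The pixel buffer must be IOSurface-backed for the cache to accept it:

```c
// Make an IOSurface-backed pixel buffer the texture cache can wrap.
CFDictionaryRef empty = CFDictionaryCreate(kCFAllocatorDefault, NULL, NULL, 0,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CFMutableDictionaryRef attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);

CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                    kCVPixelFormatType_32BGRA, attrs, &pixelBuffer);

CVOpenGLESTextureCacheRef textureCache = NULL;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, glContext, NULL, &textureCache);

CVOpenGLESTextureRef cacheTexture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache,
    pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGBA, (GLsizei)width, (GLsizei)height,
    GL_BGRA, GL_UNSIGNED_BYTE, 0, &cacheTexture);

// Core Graphics now draws straight into the texture's backing memory:
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef cacheContext = CGBitmapContextCreate(
    CVPixelBufferGetBaseAddress(pixelBuffer), width, height, 8,
    CVPixelBufferGetBytesPerRow(pixelBuffer), colorSpace,
    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// ... CGContextDrawPDFPage() or other drawing here ...
CGContextRelease(cacheContext);
CGColorSpaceRelease(colorSpace);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

glBindTexture(GL_TEXTURE_2D, CVOpenGLESTextureGetName(cacheTexture));
```

This skips the glTexImage2D() copy entirely, which is where the win for rapidly changing content comes from.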
