glReadPixels returns zeroes with multi-sampling


Problem description


I am writing an OpenGL app for iOS, and I need to take an in-app screenshot of the rendered scene. Everything works fine when I am not using multi-sampling. But when I turn multi-sampling on, glReadPixels does not return the correct data (the scene itself is drawn correctly - graphics quality is much better with multi-sampling).


I already checked a bunch of similar questions on SO and in some other places, but none of them solves my problem, since I am already doing it the proposed way:


  1. I am taking the screenshot after the buffers are resolved, but before the render buffer is presented.
  2. glReadPixels does not return an error.
  3. I even tried setting kEAGLDrawablePropertyRetainedBacking to YES and taking the screenshot after the buffer is presented - that does not work either.
  4. I support the OpenGL ES 1.x rendering API (context initialised with kEAGLRenderingAPIOpenGLES1).


Basically I am out of ideas about what could be wrong. Posting a question on SO is my last resort.

Here is the relevant source code:

Creating the framebuffer

- (BOOL)createFramebuffer
{

    glGenFramebuffersOES(1, &viewFramebuffer);
    glGenRenderbuffersOES(1, &viewRenderbuffer);

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer];
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer);

    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

    // Multisample support

    glGenFramebuffersOES(1, &sampleFramebuffer);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);

    glGenRenderbuffersOES(1, &sampleColorRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleColorRenderbuffer);
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES, backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, sampleColorRenderbuffer);

    glGenRenderbuffersOES(1, &sampleDepthRenderbuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, sampleDepthRenderbuffer);

    // End of multisample support

    if(glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES) {
        NSLog(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
        return NO;
    }

    return YES;
}

Resolving the buffers and taking the snapshot

    glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer);
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
    glResolveMultisampleFramebufferAPPLE();
    [self checkGlError];

    //glFinish();

    if (capture)
        captureImage = [self snapshot:self];    

    const GLenum discards[]  = {GL_COLOR_ATTACHMENT0_OES,GL_DEPTH_ATTACHMENT_OES};
    glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE,2,discards);

    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);    

    [context presentRenderbuffer:GL_RENDERBUFFER_OES];    

Snapshot method (basically copied from the Apple documentation)

- (UIImage*)snapshot:(UIView*)eaglview
{

    // Bind the color renderbuffer used to render the OpenGL ES view
    // If your application only creates a single color renderbuffer which is already bound at this point,
    // this call is redundant, but it is needed if you're dealing with multiple renderbuffers.
    // Note, replace "_colorRenderbuffer" with the actual name of the renderbuffer object defined in your class.    
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);


    NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight;
    NSInteger dataLength = width * height * 4;
    GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte));

    // Read pixel data from the framebuffer
    glPixelStorei(GL_PACK_ALIGNMENT, 4);
    [self checkGlError];
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);
    [self checkGlError];

    // Create a CGImage with the pixel data
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel
    // otherwise, use kCGImageAlphaPremultipliedLast
    CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                ref, NULL, true, kCGRenderingIntentDefault);

    // OpenGL ES measures data in PIXELS
    // Create a graphics context with the target size measured in POINTS
    NSInteger widthInPoints, heightInPoints;
    if (NULL != UIGraphicsBeginImageContextWithOptions) {
        // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
        // Set the scale parameter to your OpenGL ES view's contentScaleFactor
        // so that you get a high-resolution snapshot when its value is greater than 1.0
        CGFloat scale = eaglview.contentScaleFactor;
        widthInPoints = width / scale;
        heightInPoints = height / scale;
        UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
    }
    else {
        // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext
        widthInPoints = width;
        heightInPoints = height;
        UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
    }

    CGContextRef cgcontext = UIGraphicsGetCurrentContext();

    // UIKit coordinate system is upside down to GL/Quartz coordinate system
    // Flip the CGImage by rendering it to the flipped bitmap context
    // The size of the destination area is measured in POINTS
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

    // Retrieve the UIImage from the current context
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

    UIGraphicsEndImageContext();

    // Clean up
    free(data);
    CFRelease(ref);
    CFRelease(colorspace);
    CGImageRelease(iref);

    return image;
}


Answer


You resolve the multisample buffers as usual by calling glResolveMultisampleFramebufferAPPLE after binding the viewFramebuffer as draw framebuffer and the sampleFramebuffer as read framebuffer. But did you also remember to bind the viewFramebuffer as read framebuffer (glBindFramebuffer(GL_READ_FRAMEBUFFER, viewFramebuffer)) before the glReadPixels? glReadPixels always reads from the currently bound read framebuffer, and if you didn't change this binding after the multisample resolve, that will still be the multisample framebuffer and not the default one.
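Applied to the resolve-and-snapshot code from the question, the fix could look like the following sketch (variable names taken from the question; untested, since it needs a live EAGL context):

    glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer);
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
    glResolveMultisampleFramebufferAPPLE();

    // The resolve left sampleFramebuffer bound as the read framebuffer.
    // Rebind the resolved framebuffer for reading before glReadPixels:
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, viewFramebuffer);

    if (capture)
        captureImage = [self snapshot:self];    // snapshot: calls glReadPixels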


I also found your glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer) calls quite irritating, because that doesn't really do anything meaningful here: the currently bound renderbuffer is only relevant for functions that work on renderbuffers (practically only glRenderbufferStorage), although it may be that ES does something meaningful with it and that binding it is required for [context presentRenderbuffer:GL_RENDERBUFFER_OES] to work. Nevertheless, maybe you thought this binding also controls the buffer that glReadPixels reads from, but this is not the case; it always reads from the framebuffer currently bound to GL_READ_FRAMEBUFFER.
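To make the distinction concrete, here is a sketch of which binding each call actually consults (a summary of the point above, not code from the question; w, h, and pixels are hypothetical placeholders):

    // The renderbuffer binding matters only for calls that operate on
    // renderbuffers, and for presenting on iOS:
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
    glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_RGBA8_OES, w, h); // uses renderbuffer binding
    [context presentRenderbuffer:GL_RENDERBUFFER_OES];                 // uses renderbuffer binding

    // glReadPixels ignores the renderbuffer binding; it reads from the
    // framebuffer currently bound as the read framebuffer:
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, viewFramebuffer);
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);       // uses read-framebuffer binding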

