Sampling integer texture in WebGL returns weird values

Problem Description

I'm trying to render a grayscale image from a 16-bit array buffer in WebGL2 by applying window leveling in the fragment shader. I've generated the texture as below:

let typedArray = new Int16Array(data);
gl.texImage2D(
        gl.TEXTURE_2D,
        0,
        gl.R16I,
        w, h,
        0,
        gl.RED_INTEGER,
        gl.SHORT,
        typedArray);

and tried to use the data in the fragment shader below:

let fragmentShaderSource = `#version 300 es
    precision highp float;
    precision highp int;
    precision highp isampler2D;

    // our texture
    uniform isampler2D u_image;

    uniform highp float u_windowWidth;
    uniform highp float u_windowCenter;

    in vec2 v_texCoord;
    out vec4 outColor;

    void main() {
        highp float f = float(texture(u_image, v_texCoord).r);
        f = (f - (u_windowCenter - 0.5)) / max(u_windowWidth - 1.0, 1.0) + 0.5;
        f = min(max(f, 0.0), 1.0);
        outColor = vec4(vec3(f), 1.0);
    }
    `;

but this only renders a black screen. Actually, after some debugging, I found that texture(u_image, v_texCoord) returned zero in the rgb channels for every pixel, while the a (alpha) channel held a very large value (2^29 ~ 2^30). I've tried changing the precision qualifiers in the shader, but the results were the same.
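
One way to verify whether the integer data actually reached the GPU is to attach the texture to a framebuffer and read it back with gl.readPixels. This is only a rough debugging sketch, assuming the texture object is named tex and that gl, w and h come from the setup above:

// attach the R16I texture to a framebuffer so its texels can be read back
const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(
    gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);

if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE) {
  // signed-integer framebuffers can always be read back as RGBA_INTEGER / INT
  const pixels = new Int32Array(w * h * 4);
  gl.readPixels(0, 0, w, h, gl.RGBA_INTEGER, gl.INT, pixels);
  console.log('first texel (red channel):', pixels[0]);  // should match typedArray[0]
}
gl.bindFramebuffer(gl.FRAMEBUFFER, null);

If the values printed here are already wrong, the problem is in the upload rather than in the shader.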

In order to narrow down the problem, I've tried a different approach: splitting the 16-bit integer into gl.RGBA4, which stores 4 bits in each of the RGBA channels:

let typedArray = new Uint16Array(data);
gl.texImage2D(
        gl.TEXTURE_2D,
        0,
        gl.RGBA4,
        w, h,
        0,
        gl.RGBA,
        gl.UNSIGNED_SHORT_4_4_4_4,
        typedArray);

and combined the RGBA values back into a 16-bit integer in the fragment shader:

let fragmentShaderSource = `#version 300 es
    precision highp float;
    precision highp int;
    precision highp sampler2D;

    // our texture
    uniform sampler2D u_image;

    uniform highp float u_windowWidth;
    uniform highp float u_windowCenter;

    in vec2 v_texCoord;
    out vec4 outColor;

    void main() {
        highp vec4 rgba_map = texture(u_image, v_texCoord);
        // Combining rgba4 back into int16
        highp float f = rgba_map.r * 65536.0 + rgba_map.g * 4096.0 + rgba_map.b * 256.0 + rgba_map.a * 16.0;
        // signed value
        if (f > 32768.0) {
            f = 65536.0 - f;
        }
        f = (f - (u_windowCenter - 0.5)) / max(u_windowWidth - 1.0, 1.0) + 0.5;
        f = min(max(f, 0.0), 1.0);
        outColor = vec4(vec3(f), 1.0);
    }
    `;

and this version rendered the expected image quite well, although the result was a bit noisy because of the conversion. I've also tried some other formats: those with a float type were fine, while the integer-type formats all failed. So I think the other parts of the program are fine. I wonder what is wrong with my program.
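
For reference, a float-format upload (the kind of format that did work) might look like the sketch below. It reuses data, w and h from above and assumes gl.R32F as the internal format, which is not necessarily the exact format tried in the question:

// convert the signed 16-bit samples to 32-bit floats
let floatData = Float32Array.from(new Int16Array(data));

gl.texImage2D(
        gl.TEXTURE_2D,
        0,            // mip level
        gl.R32F,      // internal format: one 32-bit float channel
        w, h,
        0,            // border
        gl.RED,       // source format
        gl.FLOAT,     // source type
        floatData);

// R32F is not filterable without OES_texture_float_linear, so use NEAREST
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

With a float format the shader keeps an ordinary sampler2D, and texture(...).r returns the raw sample value, so the window/level math is unchanged.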

Solution

You haven't really posted enough code to debug, so let's just make something that works.

function main() {
  const gl = document.querySelector('canvas').getContext('webgl2');
  if (!gl) {
    return alert('need WebGL2');
  }
  const vs = `#version 300 es
  void main() {
    gl_PointSize = 300.0;
    gl_Position = vec4(0, 0, 0, 1);
  }
  `;
  const fs = `#version 300 es
  precision highp float;
  precision highp int;
  precision highp isampler2D;

  // our texture
  uniform isampler2D u_image;

  out vec4 color;
  
  void main() {
    ivec4 intColor = texture(u_image, gl_PointCoord.xy);
    color = vec4(vec3(intColor.rrr) / 10000.0, 1);
  }
  `;
  
  const program = twgl.createProgram(gl, [vs, fs]);
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  gl.texImage2D(
      gl.TEXTURE_2D,
      0,               // mip level
      gl.R16I,         // internal format
      10,              // width
      1,               // height
      0,               // border
      gl.RED_INTEGER,  // source format
      gl.SHORT,        // source type
      new Int16Array([
        1000, 2000, 3000, 4000, 5000, 6000, 7000, 8000, 9000, 10000
      ]));
  // can't filter integer textures
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
  
  gl.useProgram(program);
  
  // no need to set any attributes or
  // uniforms as we're not using attributes
  // and uniforms default to zero so will use
  // texture unit zero
  gl.drawArrays(gl.POINTS, 0, 1);
  
  console.log('max point size:', gl.getParameter(gl.ALIASED_POINT_SIZE_RANGE)[1]);
}
main();

canvas {
  border: 1px solid black;
  background: red;
}

<script src="https://twgljs.org/dist/4.x/twgl-full.min.js"></script>
<canvas></canvas>

It should look like a horizontal gray gradient covering the canvas, but it might have red borders if your GPU's max point size is < 300.

A few ideas:

  • did you check the JavaScript console for errors?

  • did you turn off filtering for the texture?

    integer texture can not be filtered

  • is your texture width an even number?

    If not, you probably need to set gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1), though I'd have expected you to get an error here unless your Int16Array is larger than width * height (see the sketch after this list).
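
Putting the last two points together, a minimal texture-setup sketch might look like this, reusing gl, w, h and typedArray as named in the question's code:

// integer textures can't be filtered, so sampling requires NEAREST
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

// each row of 16-bit texels is width * 2 bytes; if the width is odd,
// that is not a multiple of the default 4-byte row alignment
gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);

gl.texImage2D(
    gl.TEXTURE_2D, 0, gl.R16I,
    w, h, 0,
    gl.RED_INTEGER, gl.SHORT,
    typedArray);

The UNPACK_ALIGNMENT line only matters for odd widths; the NEAREST filters are required for any integer texture.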
