Reconstructed world position from depth is wrong
Problem description
I'm trying to implement deferred shading/lighting. To reduce the number and size of the buffers I use, I want to use the depth texture to reconstruct the world position later on.
I do this by multiplying the pixel's coordinates by the inverse of the projection matrix and the inverse of the camera matrix. This sort of works, but the position is a bit off. Here's the absolute difference from a sampled world-position texture:
For reference, this is the code I use in the second pass fragment shader:
vec2 screenPosition_texture = vec2((gl_FragCoord.x)/WIDTH, (gl_FragCoord.y)/HEIGHT);
float pixelDepth = texture2D(depth, screenPosition_texture).x;
vec4 worldPosition = pMatInverse*vec4(VertexIn.position, pixelDepth, 1.0);
worldPosition = vec4(worldPosition.xyz/worldPosition.w, 1.0);
//worldPosition /= 1.85;
worldPosition = cMatInverse*worldPosition;
If I uncomment worldPosition /= 1.85, the position is reconstructed a lot better (for my geometry/range of depth values). I only found this value by trial and error, comparing my output against what it should be (stored in a third texture).
I'm using a near plane of 0.1 and a far plane of 100.0, and my geometry is at most about 15 units away. I know there may be precision errors, but this seems like too large an error this close to the camera. Am I missing anything here?
Answer
As mentioned in a comment: I didn't convert the depth value from window space [0, 1] back to NDC space [-1, 1] before applying the inverse projection. I should have added this line:
pixelDepth = pixelDepth*2.0 - 1.0;
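The effect of this fix can be checked numerically outside the shader. The sketch below is not the original GLSL; it is a pure-Python reproduction of the same math, assuming a standard OpenGL perspective matrix with the question's near = 0.1 and far = 100.0, an identity camera matrix (so view space equals world space and cMatInverse drops out), and illustrative values throughout. It projects a known point, stores depth in [0, 1] the way the depth buffer does, then reconstructs the position both without and with the d*2 - 1 conversion.

```python
import math

def perspective(fovy_deg, aspect, near, far):
    # Standard OpenGL perspective projection matrix (row-major here).
    f = 1.0 / math.tan(math.radians(fovy_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def inverse(m):
    # 4x4 inverse via Gauss-Jordan elimination on [m | I].
    n = 4
    a = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(m)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        p = a[col][col]
        a[col] = [x / p for x in a[col]]
        for r in range(n):
            if r != col and a[r][col] != 0.0:
                factor = a[r][col]
                a[r] = [x - factor * y for x, y in zip(a[r], a[col])]
    return [row[n:] for row in a]

P = perspective(60.0, 1.0, 0.1, 100.0)   # near/far as in the question
P_inv = inverse(P)

# A view-space point ~10 units in front of the camera.
p = [1.0, 2.0, -10.0, 1.0]

# Forward pass: project, perspective-divide, then store depth the way
# the depth buffer does: window depth = ndc.z * 0.5 + 0.5, in [0, 1].
clip = mat_vec(P, p)
ndc = [c / clip[3] for c in clip]
stored_depth = ndc[2] * 0.5 + 0.5

# Wrong reconstruction: feed the [0, 1] depth straight into the
# inverse projection, as in the question's shader.
bad = mat_vec(P_inv, [ndc[0], ndc[1], stored_depth, 1.0])
bad = [c / bad[3] for c in bad]

# Correct reconstruction: first map depth back to NDC with d*2 - 1.
good = mat_vec(P_inv, [ndc[0], ndc[1], stored_depth * 2.0 - 1.0, 1.0])
good = [c / good[3] for c in good]

print("original:   ", p[:3])
print("without fix:", [round(c, 3) for c in bad])
print("with fix:   ", [round(c, 3) for c in good])
```

With the fix, the reconstructed point matches the original to floating-point precision; without it, the point lands noticeably too far from the camera. Notably, at this kind of depth the wrong result is roughly a uniform scale of the correct one, which is consistent with the question's observation that dividing by a magic constant (1.85) approximately compensated over a limited depth range.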