glsl sampler2DShadow and shadow2D clarification


Problem Description

Quick background of where I'm at (to make sure we're on the same page, and sanity check if I'm missing/assuming something stupid):

  • Goal: I want to render my scene with shadows, using deferred lighting and shadowmaps.
  • Struggle: finding clear and consistent documentation regarding how to use shadow2D and sampler2DShadow.

Here's what I'm currently doing:

In the fragment shader of my final rendering pass (the one that actually calculates final frag values), I have the MVP matrices from the pass from the light's point of view, the depth texture from said pass (aka the "shadow map"), and the position/normal/color textures from my geometry buffer.

From what I understand, I need to find what UV of the shadow map the position of the current fragment corresponds to. I do that by the following:

//Bring position value at fragment (in world space) to screen space from light's POV
//(.xy is a vec2, so the result must be stored in a vec2, not a vec4)
vec2 UVinShadowMap = (lightProjMat * lightViewMat * vec4(texture(pos_tex, UV).xyz, 1.0)).xy;
//Convert screen space to 'texture space' (from -1..1 to 0..1)
UVinShadowMap = (UVinShadowMap + 1.0) / 2.0;

Now that I have this UV, I can get the perceived 'depth' from the light's POV with

float depFromLightPOV = texture2D(shadowMap, UVinShadowMap).r;

and compare that against the distance between the position at the current fragment and the light:

float actualDistance = distance(texture2D(pos_tex, UV).xyz, lightPos);

The problem comes from that 'depth' is stored in values 0-1, and actual distance is in world coordinates. I've tried to do that conversion manually, but couldn't get it to work. And in searching online, it looks like the way I SHOULD be doing this is with a sampler2DShadow...

So here are my questions:

What changes do I need to make to instead use shadow2D? What does shadow2D even do? Is it just more-or-less an auto-conversion-from-depth-to-world texture? Can I use the same depth texture? Or do I need to render the depth texture a different way? What do I pass in to shadow2D? The world-space position of the fragment I want to check? Or the same UV as before?

If all these questions can be answered in a simple documentation page, I'd love if someone could just post that. But I swear I've been searching for hours and can't find anything that simply says what the heck is going on with shadow2D!

Thanks!

Recommended Answer

First of all, what version of GLSL are you using?

Beginning with GLSL 1.30, there is no special texture lookup function (name anyway) for use with sampler2DShadow. GLSL 1.30+ uses a bunch of overloads of texture (...) that are selected based on the type of sampler passed and the dimensions of the coordinates.

  1. Texture comparison must be enabled or you will get undefined results

  • GL_TEXTURE_COMPARE_MODE = GL_COMPARE_REF_TO_TEXTURE

The coordinates you pass to texture (...) are 3D instead of 2D. The new 3rd coordinate is the depth value that you are going to compare.
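In GLSL 1.30+ syntax, a minimal sketch of such a lookup might look like this (the names shadowUV and refDepth are illustrative, not from the question):

```glsl
uniform sampler2DShadow shadowMap;

// shadowUV is the 2D location in the shadow map; refDepth is the
// reference depth to compare against the stored depth at that texel.
float visibility = texture(shadowMap, vec3(shadowUV, refDepth));
```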

Last, you should understand what texture (...) returns when using sampler2DShadow:

If this comparison passes, texture (...) will return 1.0, if it fails it will return 0.0. If you use a GL_LINEAR texture filter on your depth texture, then texture (...) will perform 4 depth comparisons using the 4 closest depth values in your depth texture and return a value somewhere in-between 1.0 and 0.0 to give an idea of the number of samples that passed/failed.

That is the proper way to do hardware anti-aliasing of shadow maps. If you tried to use a regular sampler2D with GL_LINEAR and implement the depth test yourself you would get a single averaged depth back and a boolean pass/fail result instead of the behavior described above for sampler2DShadow.

As for getting a depth value to test from a world-space position, you were on the right track (though you forgot perspective division).

  1. Multiply the world-space position by your (light's) projection and view matrices
  2. Divide the resulting coordinate by its W component
  3. Scale and bias the result (which will be in the range [-1,1]) into the range [0,1]

The final step assumes you are using the default depth range... if you have not called glDepthRange (...) then this will work.

The end result of step 3 serves as both a depth value (R) and texture coordinates (ST) for lookup into your depth map. This makes it possible to pass this value directly to texture (...). Recall that the first 2 components of the texture coordinates are the same as always, and that the 3rd is a depth value to test.
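Putting the three steps and the shadow lookup together, a sketch of the final-pass fragment shader might look like the following (GLSL 3.30 syntax, reusing the uniform names from the question; it assumes shadowMap is the depth texture with GL_TEXTURE_COMPARE_MODE set to GL_COMPARE_REF_TO_TEXTURE, and the output name fragColor is an assumption):

```glsl
#version 330 core

uniform sampler2DShadow shadowMap; // depth texture with compare mode enabled
uniform sampler2D pos_tex;         // world-space positions from the G-buffer
uniform mat4 lightProjMat;
uniform mat4 lightViewMat;

in vec2 UV;
out vec4 fragColor;

void main()
{
    // 1. World-space position -> light's clip space
    vec4 lightClip = lightProjMat * lightViewMat
                   * vec4(texture(pos_tex, UV).xyz, 1.0);

    // 2. Perspective divide -> NDC, each component in [-1,1]
    vec3 ndc = lightClip.xyz / lightClip.w;

    // 3. Scale and bias into [0,1]; xy = shadow map UV, z = reference depth
    vec3 shadowCoord = ndc * 0.5 + 0.5;

    // 1.0 if the fragment passes the depth comparison (lit), 0.0 if it
    // fails (shadowed), or in-between with GL_LINEAR filtering (PCF)
    float visibility = texture(shadowMap, shadowCoord);

    fragColor = vec4(vec3(visibility), 1.0);
}
```

Note that steps 1-3 replace the manual `(UVinShadowMap+1)/2` conversion from the question: without the divide by W, the coordinates are only correct for an orthographic light projection.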
