Texture is not generated correctly (probably)

Problem description

My app for volume ray casting in WebGL is almost done, but I have found a problem. I have to simulate a 3D texture with a 2D texture, which in itself is not an issue: I build one huge texture out of small slices, and the dimensions of the huge texture are about 4096x4096 px. The problem is that in some cases (it depends on the number of slices) something like the image below appears (I filled the huge texture with white to make the artifact more visible).
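For reference, this kind of atlas is usually assembled on a 2D canvas that is later handed to Three.js. The sketch below only illustrates that idea and is not the poster's actual code; the slice images, their size, and the grid layout are all assumed names:

// Illustration only: pack slice images into one big 2D "atlas" canvas.
// sliceImages, sliceWidth, sliceHeight, numCols and numRows are assumptions.
function buildAtlas(sliceImages, sliceWidth, sliceHeight, numCols, numRows) {
  var canvas = document.createElement('canvas');
  canvas.width = numCols * sliceWidth;   // ends up near 4096 in this question
  canvas.height = numRows * sliceHeight;
  var ctx = canvas.getContext('2d');
  for (var i = 0; i < sliceImages.length; i++) {
    var col = i % numCols;
    var row = Math.floor(i / numCols);
    ctx.drawImage(sliceImages[i], col * sliceWidth, row * sliceHeight);
  }
  return canvas; // can be wrapped in a THREE.Texture afterwards
}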

I know that the number of stripes depends on the number of rows in the huge texture. The texture I generate is close to 4096x4096 px, but not exactly; it could be 4080x4060 or similar. My guess is that Three.js uploads my texture to the GPU but does not scale it up to 4096x4096, so the fragment shader reads black at the border of the texture, because WebGL only works well with power-of-two textures (512x512, 1024x1024, and so on). That would cause the black stripes in the rendered image.
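If the atlas really does end up at a non-power-of-two size such as 4080x4060, one common workaround (a sketch under that assumption, not a confirmed fix for this case) is to allocate the canvas at the next power of two, draw the slices only into the top-left region, and scale the UVs in the shader by the returned scaleU/scaleV so sampling never reaches the padded border:

// Hypothetical helper: round a dimension up to the next power of two.
function nextPowerOfTwo(n) {
  var p = 1;
  while (p < n) p *= 2;
  return p;
}

// Allocate a power-of-two canvas and report how much of it is actually used.
function createPaddedAtlasCanvas(usedWidth, usedHeight) {
  var canvas = document.createElement('canvas');
  canvas.width = nextPowerOfTwo(usedWidth);   // e.g. 4080 -> 4096
  canvas.height = nextPowerOfTwo(usedHeight); // e.g. 4060 -> 4096
  return {
    canvas: canvas,
    scaleU: usedWidth / canvas.width,   // multiply atlas UVs by these factors
    scaleV: usedHeight / canvas.height
  };
}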

The problem is that my Three.js app does not work with the WebGL Inspector, so I cannot be sure.

Any idea how to fix this?

Thanks.

Thomas

OK, I have found the problem... and a "solution"... but the solution does not work well. I have two datasets: one is OK, the second still shows the same error.

Two variants of the code (each one works fine for one dataset but not for the other); a CPU-side check of both formulas is sketched after the snippets:

First)

 dx = mod(slice, numColsInTexture) / numColsInTexture;
 dy = 1.0 - (floor(slice / numColsInTexture) / numRowsInTexture);

Second)

dx = 1.0 -  (mod(slice, numColsInTexture) / numColsInTexture);
dy = (floor(slice / numColsInTexture) / numRowsInTexture);
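
One way to see how the two variants differ is to mirror the same arithmetic in plain JavaScript and print the offsets for a few slice indices against the known atlas layout. This is only a debugging sketch, and the 16x16 grid is an assumed example:

// Debugging sketch: reproduce both offset formulas on the CPU.
function sliceOffsetVariant1(slice, numCols, numRows) {
  var dx = (slice % numCols) / numCols;
  var dy = 1.0 - Math.floor(slice / numCols) / numRows;
  return { dx: dx, dy: dy };
}

function sliceOffsetVariant2(slice, numCols, numRows) {
  var dx = 1.0 - (slice % numCols) / numCols;
  var dy = Math.floor(slice / numCols) / numRows;
  return { dx: dx, dy: dy };
}

// With a 16x16 grid, slice 0 gives (0, 1) in variant 1 and (1, 0) in variant 2,
// i.e. the two variants are flipped in both axes relative to each other.
console.log(sliceOffsetVariant1(0, 16, 16), sliceOffsetVariant2(0, 16, 16));

If each dataset only works with one variant, one possible explanation is that the two atlases are packed in mirrored order, or that a setting such as the texture's flipY flag differs between them.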

I really don't know why it doesn't work for both datasets... I tried to dump the GPU state (WebGL Inspector). Both textures are loaded correctly on the GPU (same orientation, same dimensions). Everything is the same.

Please help me... thanks.

Recommended answer

Are the MIN and MAG filters of your texture set to NEAREST? And are your WRAP_[ST] modes set to CLAMP_TO_EDGE?
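In Three.js these parameters are set on the texture object before it is uploaded. A minimal sketch, assuming the atlas lives on a canvas called atlasCanvas:

// Sketch: disable filtering, wrapping and mipmaps on the assumed atlas texture,
// so the ray caster never blends or wraps across slice boundaries.
var atlasTexture = new THREE.Texture(atlasCanvas); // atlasCanvas is an assumption
atlasTexture.minFilter = THREE.NearestFilter;
atlasTexture.magFilter = THREE.NearestFilter;
atlasTexture.wrapS = THREE.ClampToEdgeWrapping;
atlasTexture.wrapT = THREE.ClampToEdgeWrapping;
atlasTexture.generateMipmaps = false; // mipmaps would mix neighbouring slices
atlasTexture.needsUpdate = true;      // tell Three.js to (re)upload the canvas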
