My WebGL app for volume ray casting is nearly complete. However, I have run into an issue with simulating a 3D texture using a 2D texture: I build one huge texture out of the individual slices, and its dimensions come out at roughly 4096x4096 px. In certain scenarios (depending on the number of slices), artifacts like those in the image below appear (I filled the huge texture with white to make the fragments more visible).
I have noticed that the number of stripes depends on the number of rows in the huge texture. The texture I generate is close to 4096x4096 px but not exactly (e.g. 4080x4060), so I suspect that Three.js uploads it to the GPU without scaling it up to 4096x4096, and since WebGL only fully supports power-of-two textures (512x512, 1024x1024, etc.), the remaining border ends up black. The fragment shader then samples that black border, which produces the black stripes in the rendered image.
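For context, the lookup in my fragment shader conceptually works like the sketch below. This is a simplified illustration rather than my exact code, and the uniform and function names (uAtlas, uNumCols, uNumRows, uNumSlices, sampleAs3DTexture) are placeholders:

// Simplified sketch of sampling a 2D slice atlas as if it were a 3D texture.
// All names here are placeholders, not my actual uniforms.
uniform sampler2D uAtlas;   // the big ~4096x4096 texture built from the slices
uniform float uNumCols;     // slices per row in the atlas
uniform float uNumRows;     // rows of slices in the atlas
uniform float uNumSlices;   // total number of slices

vec4 sampleAs3DTexture(vec3 pos) {
    // pick the slice that the z coordinate falls into
    float slice = floor(pos.z * (uNumSlices - 1.0));
    // origin of that slice's tile inside the atlas, in [0, 1] texture space
    float dx = mod(slice, uNumCols) / uNumCols;
    float dy = floor(slice / uNumCols) / uNumRows;
    // position inside the tile
    vec2 uv = vec2(dx + pos.x / uNumCols, dy + pos.y / uNumRows);
    return texture2D(uAtlas, uv);
}

The dx/dy computation is exactly the part I keep fiddling with in the edit below.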
The biggest hurdle is that my Three.js application does not work with WebGL Inspector, so I cannot tell whether this is really the root cause.
Any suggestions on how to rectify this?
Thanks,
Tomáš
EDIT:
Alright, I've identified the issue and attempted a "solution", but it doesn't work reliably: I have two datasets, and one now renders correctly while the other still shows the same error.
Here are the two variants of the code (each works well for one dataset but not for the other):
First)
// dx: column offset of the slice's tile; dy: row offset, flipped vertically (1.0 - row)
dx = mod(slice, numColsInTexture) / numColsInTexture;
dy = 1.0 - (floor(slice / numColsInTexture) / numRowsInTexture);
Second)
// dx: column offset, flipped horizontally (1.0 - column); dy: row offset, not flipped
dx = 1.0 - (mod(slice, numColsInTexture) / numColsInTexture);
dy = (floor(slice / numColsInTexture) / numRowsInTexture);
I'm not sure why neither version works for both datasets. I did manage to inspect the GPU with WebGL Inspector: both textures are uploaded correctly, with the same orientation and the same dimensions. Everything looks identical.
Please help me out... thank you.