I am converting a shader effect from Shadertoy into a local Three.js project, but I have not been able to get it to render properly. If you'd like to give it a try yourself, the full code snippet is available here.
The main issue seems to stem from how I am handling the iResolution variable. In Shadertoy, this built-in uniform holds the resolution of the rendering viewport in pixels. Here is how iResolution is used in the original Shadertoy code:
vec2 uv = fragCoord.xy / iResolution.y;
vec2 ak = abs(fragCoord.xy / iResolution.xy - 0.5);
For the conversion to a local Three.js script, where Shadertoy's mainImage(out vec4 fragColor, in vec2 fragCoord) wrapper is not available and gl_FragCoord takes the place of fragCoord, I've tried two different approaches to handling iResolution:
1) First, I loaded the window dimensions into a Vector2 and passed them into the shader as the uniform vec2 uResolution:
vec2 uv = gl_FragCoord.xy / uResolution.y;
vec2 ak = abs(gl_FragCoord.xy / uResolution.xy - 0.5);
This closely mirrors the structure of the original Shadertoy code, but unfortunately nothing renders.
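For context, here is a minimal, self-contained sketch of how this first attempt is wired up on the JavaScript side. Only the uResolution uniform and the two shader lines above come from my actual attempt; the renderer, orthographic camera, and full-screen quad are illustrative boilerplate, and the gl_FragColor line is a placeholder standing in for the real effect:

import * as THREE from 'three';

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
// Orthographic camera plus a 2x2 plane = full-screen quad in clip space,
// so the fragment shader runs once per canvas pixel.
const camera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

// Window dimensions loaded into a Vector2 and exposed as uResolution.
const uniforms = {
  uResolution: { value: new THREE.Vector2(window.innerWidth, window.innerHeight) },
};

const material = new THREE.ShaderMaterial({
  uniforms,
  fragmentShader: `
    uniform vec2 uResolution;
    void main() {
      vec2 uv = gl_FragCoord.xy / uResolution.y;
      vec2 ak = abs(gl_FragCoord.xy / uResolution.xy - 0.5);
      gl_FragColor = vec4(uv, ak.x + ak.y, 1.0); // placeholder output
    }
  `,
});

scene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), material));

// Keep the uniform in sync with the canvas size.
window.addEventListener('resize', () => {
  renderer.setSize(window.innerWidth, window.innerHeight);
  uniforms.uResolution.value.set(window.innerWidth, window.innerHeight);
});

renderer.render(scene, camera);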
2) The second approach, borrowed from this helpful Stack Overflow answer, transforms the uv coordinates into absolute xy values:
vec2 uvCustom = -1.0 + 2.0 * vUv;
vec2 ak = abs(gl_FragCoord.xy / uvCustom.xy - 0.5);
To be honest, I'm not entirely clear on how this works, and there may be an error in my use of the uvCustom variable in the second line.
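For completeness, here is how I understand the vUv-based variant is supposed to be wired up (a sketch with illustrative names; the gl_FragColor line is again a placeholder). vUv is interpolated from Three.js's built-in uv attribute, so across a full-screen quad it already spans 0..1 and plays the role of fragCoord.xy / iResolution.xy from the original:

const material2 = new THREE.ShaderMaterial({
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv; // built-in attribute, 0..1 across the plane
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    varying vec2 vUv;
    void main() {
      vec2 uvCustom = -1.0 + 2.0 * vUv; // remap 0..1 to -1..1
      vec2 ak = abs(vUv - 0.5);         // direct analogue of the original ak line
      gl_FragColor = vec4(uvCustom, ak.x + ak.y, 1.0); // placeholder output
    }
  `,
});

If that reading is right, then dividing gl_FragCoord.xy (which is in pixels) by uvCustom (which is in the normalized -1..1 range) in my second snippet mixes two coordinate spaces, which may be exactly the error I suspected.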
Ultimately, despite my best efforts, nothing appears on the screen other than a CameraHelper object from Three.js; the rest is pitch black, and there are no errors in the JavaScript console or from WebGL. Thank you for your attention and assistance!