I've been attempting to create a basic noise shader, but with my current setup I'm unable to retrieve the UV coordinates.
Fragment Shader:
uniform float seed;
uniform sampler2D pass;
varying vec2 vUv;

void main() {
    // pseudo-random noise derived from the fragment position
    vec2 pos = gl_FragCoord.xy * seed;
    float lum = fract(sin(dot(pos, vec2(12.9898, 78.233))) * 434658.5453116487577816842168767168087910388737310);
    vec4 tx = texture2D(pass, vUv);
    gl_FragColor = vec4(tx.rgb * lum, 1.0);
}
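For reference, the widely used one-liner hash this appears to be based on is normally written with the constant 43758.5453; the helper below is only a restatement of that well-known snippet, not part of my code:

```glsl
// canonical GLSL pseudo-random one-liner, commonly seen in shader examples
float rand(vec2 co) {
    return fract(sin(dot(co, vec2(12.9898, 78.233))) * 43758.5453);
}
```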
Vertex Shader:
varying vec2 vUv;

void main() {
    vUv = uv;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}
Rendering:
OBJECT.material = OBJECT.mat.flat; // THREE.MeshPhongMaterial({ color: 0xE40D59, shading: THREE.FlatShading })
RENDERER.render(SCENE, CAMERA, BEAUTY_PASS, false);
OBJECT.material = OBJECT.mat.noise; // THREE.ShaderMaterial
RENDERER.render(SCENE, CAMERA);
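For context, the render target and the noise material are wired up roughly like this (a configuration sketch; the sizes, filter options, and shader source variable names are assumptions, while `BEAUTY_PASS` and `OBJECT.mat.noise` are the names used above):

```javascript
// First-pass target; the flat-shaded render goes here (sizes are placeholders)
var BEAUTY_PASS = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight, {
    minFilter: THREE.LinearFilter,
    magFilter: THREE.LinearFilter
});

// Noise material for the second pass; "pass" receives the first-pass result
OBJECT.mat.noise = new THREE.ShaderMaterial({
    uniforms: {
        seed: { type: 'f', value: Math.random() },
        pass: { type: 't', value: BEAUTY_PASS }
    },
    vertexShader: vertexShaderSource,     // the vertex shader below
    fragmentShader: fragmentShaderSource  // the fragment shader above
});
```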
An error message is displayed:
Error: WebGL: DrawElements: bound vertex attribute buffers do not have sufficient size for given indices from the bound element array @
After some testing, I found that it works if I sample the same coordinate for every fragment:
vec4 tx = texture2D(pass, vec2(0.5, 0.5));
This displays my object with a reddish, noisy color. The problem only appears when I try to read the UV coordinate during the second render pass, i.e. after the initial pass to the render target (
RENDERER.render(SCENE, CAMERA, BEAUTY_PASS, false)
). Without that initial pass it works fine.
Why am I unable to fetch the uv coordinate during the second render? Judging from various examples, it should be possible to render twice with the same scene and camera configuration, as shown in this example.