In my scene, I apply the ShaderMaterial shown below to my objects, and it works fine. However, when I set the WebGLRenderer option logarithmicDepthBuffer to true, the material is no longer displayed correctly.
new THREE.ShaderMaterial({
  uniforms: {
    color1: { value: new THREE.Color('#3a0000') },
    color2: { value: new THREE.Color('#ffa9b0') }
  },
  vertexShader: `
    varying vec3 vNormal;
    void main(void) {
      vNormal = normalMatrix * normalize(normal);
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }`,
  fragmentShader: `
    uniform vec3 color1;
    uniform vec3 color2;
    varying vec3 vNormal;
    void main(void) {
      vec3 view_nv  = normalize(vNormal);
      vec3 nv_color = view_nv * 0.5 + 0.5;
      vec3 c = mix(color1, color2, nv_color.r);
      gl_FragColor = vec4(c, 1.0);
    }`,
  side: THREE.DoubleSide
});
While researching this issue, I came across a helpful SO answer. In a nutshell, the solution involves integrating four code snippets into the vertexShader and fragmentShader.
Where exactly should I place the provided code snippets within the vertex shader body and the fragment shader body? Despite trying various placements, I kept encountering WebGL errors such as:
THREE.WebGLProgram: shader error: 0 gl.VALIDATE_STATUS false gl.getProgramInfoLog Must have a compiled vertex shader attached. ERROR: 0:63: 'EPSILON' : undeclared identifier
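For reference, this is one of the placements I have attempted. It is only a sketch: I am assuming the four snippets correspond to the logdepthbuf shader chunks, which (as far as I understand) a ShaderMaterial can pull in via #include directives, and I also added #include <common> because the EPSILON in the error message seems to be defined there. The chunk names and their positions are exactly the part I am unsure about.

  vertexShader: `
    #include <common>
    #include <logdepthbuf_pars_vertex>
    varying vec3 vNormal;
    void main(void) {
      vNormal = normalMatrix * normalize(normal);
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
      // placed after gl_Position has been written
      #include <logdepthbuf_vertex>
    }`,
  fragmentShader: `
    #include <common>
    #include <logdepthbuf_pars_fragment>
    uniform vec3 color1;
    uniform vec3 color2;
    varying vec3 vNormal;
    void main(void) {
      #include <logdepthbuf_fragment>
      vec3 view_nv  = normalize(vNormal);
      vec3 nv_color = view_nv * 0.5 + 0.5;
      vec3 c = mix(color1, color2, nv_color.r);
      gl_FragColor = vec4(c, 1.0);
    }`,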
UPDATE: I have added a playground: https://codepen.io/anon/pen/gQoaye
If you include the logarithmicDepthBuffer option in the constructor, you will notice that the ShaderMaterial stops rendering correctly.
var renderer = new THREE.WebGLRenderer({ logarithmicDepthBuffer: true });