I've set up a scene with a sphere lit by a DirectionalLight to simulate the sun shining on the Earth. My goal is a shader that shows a night texture of the Earth on the unlit portion of the globe and a day texture on the lit portion. Eventually, I plan to animate the DirectionalLight moving around the globe so the day/night boundary updates in real time. I stumbled upon a CodePen example that partially achieves what I'm aiming for: https://codepen.io/acauamontiel/pen/yvJoVv
In the provided CodePen demo, the day/night textures depend on the camera's view relative to the globe. However, my requirement is to tie these textures to the position of the light source rather than the camera.
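My reading of how such demos typically work (this is a reconstruction, not the CodePen's exact code) is that the vertex shader transforms the normal with normalMatrix, which is derived from the model-view matrix, so the blend ends up happening in camera (view) space:

// Vertex shader (typical camera-based setup, reconstructed for illustration)
varying vec3 vNormal;
varying vec2 vUv;
void main() {
  vUv = uv;
  vNormal = normalize(normalMatrix * normal); // view-space: changes as the camera moves
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// Fragment shader
uniform sampler2D dayTexture;
uniform sampler2D nightTexture;
varying vec3 vNormal;
varying vec2 vUv;
void main() {
  // +z in view space points toward the camera, so this blend follows the viewer
  float facing = dot(normalize(vNormal), vec3(0.0, 0.0, 1.0));
  float blend = smoothstep(-0.2, 0.2, facing);
  gl_FragColor = vec4(mix(texture2D(nightTexture, vUv).rgb,
                          texture2D(dayTexture, vUv).rgb,
                          blend), 1.0);
}

Here is the structure of my code: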
class Canvas {
  constructor(selector) {
    // Constructor code here...
  }

  // Additional methods for setting up the scene, camera, renderer, controls, lights, render loop, etc...

  get dayNightShader() {
    return {
      // Vertex and fragment shader code here...
    };
  }

  init() {
    // Initialization code for setting up the scene, camera, lights, renderer, etc...
  }
}
let canvas = new Canvas('#canvas');
canvas.init();
From what I can tell, the shader returned by get dayNightShader() updates according to the camera: modelViewMatrix, projectionMatrix, and normalMatrix all depend on the camera's parameters. I attempted to replace these matrices with a fixed vector position instead of the camera viewpoint, but that only displayed the atmosphere texture and hid the globe entirely. Is there a way to use the light source's position to drive the shader output rather than relying on the camera?
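For concreteness, here is the kind of approach I'm imagining: pass the light's direction into the shader as a uniform, compute the normal in world space (so it no longer depends on the camera), and blend the two textures by the dot product of the two. This is just a minimal sketch assuming the globe sits at the origin and the DirectionalLight targets it; the texture paths and the light variable name are placeholders:

const vertexShader = `
  varying vec2 vUv;
  varying vec3 vWorldNormal;

  void main() {
    vUv = uv;
    // World-space normal: independent of the camera, unlike normalMatrix * normal
    // (mat3(modelMatrix) is fine for a uniformly scaled sphere)
    vWorldNormal = normalize(mat3(modelMatrix) * normal);
    gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
  }
`;

const fragmentShader = `
  uniform sampler2D dayTexture;
  uniform sampler2D nightTexture;
  uniform vec3 sunDirection; // world-space direction from the globe toward the sun

  varying vec2 vUv;
  varying vec3 vWorldNormal;

  void main() {
    // > 0 on the day side, < 0 on the night side
    float intensity = dot(normalize(vWorldNormal), normalize(sunDirection));
    // Soften the terminator instead of using a hard cut
    float blend = smoothstep(-0.1, 0.1, intensity);
    vec3 dayColor = texture2D(dayTexture, vUv).rgb;
    vec3 nightColor = texture2D(nightTexture, vUv).rgb;
    gl_FragColor = vec4(mix(nightColor, dayColor, blend), 1.0);
  }
`;

const loader = new THREE.TextureLoader();
const uniforms = {
  dayTexture: { value: loader.load('textures/earth-day.jpg') },     // placeholder path
  nightTexture: { value: loader.load('textures/earth-night.jpg') }, // placeholder path
  sunDirection: { value: new THREE.Vector3(1, 0, 0) },
};

const globeMaterial = new THREE.ShaderMaterial({ uniforms, vertexShader, fragmentShader });

// Called each frame: with the globe at the origin and the light targeting it,
// the light's normalized position is the sun direction, so the terminator
// follows the animated light rather than the camera.
function updateSunDirection(light) {
  uniforms.sunDirection.value.copy(light.position).normalize();
}

The key change is that sunDirection is a plain uniform updated from the light each frame, so none of the camera-derived matrices are involved in the day/night decision.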