I am currently developing a path tracer with THREE.js. The approach is to render a full-screen quad and do the path tracing in the pixel (fragment) shader.
To increase the sample count, one option is to trace one path per pixel per pass and average the resulting images over many shader passes.
While I can generate the individual sample images, I am struggling with how to accumulate them. My initial idea is to use two render targets: one holding the latest sampled image and one holding the running average of all samples so far.
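Concretely, I intend to update the average incrementally each pass as avg_new = (n * avg_old + newSample) / (n + 1), where n is the number of samples accumulated so far, so only the previous average and a sample counter need to survive between passes.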
However, I am unsure how to read the contents of one WebGLRenderTarget and use them while writing into another render target. Is this even feasible with THREE.js? I have been exploring framebuffer objects (FBOs) as a potential solution and going through MrDoob's FBO example, which seems promising, but I'm not sure I am on the right track.
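To make the plan concrete, here is a rough sketch of the ping-pong structure I have in mind; it is not working code. It assumes the `renderer.setRenderTarget()` API, and names like `accumMaterial`, `sampleTarget`, `pathTracingScene`, `accumScene`, `displayScene`, `displayMaterial`, `quadCamera`, `width` and `height` are placeholders for my existing full-screen-quad setup, not anything from THREE.js or MrDoob's example:

```js
import * as THREE from 'three';

// Two float render targets for ping-pong accumulation, plus one for the
// newest path-traced sample. Float type avoids banding as samples pile up.
const opts = { type: THREE.FloatType, minFilter: THREE.NearestFilter, magFilter: THREE.NearestFilter };
let accumRead  = new THREE.WebGLRenderTarget(width, height, opts);
let accumWrite = new THREE.WebGLRenderTarget(width, height, opts);
const sampleTarget = new THREE.WebGLRenderTarget(width, height, opts);

// Accumulation pass: blends the newest sample into the running average.
const accumMaterial = new THREE.ShaderMaterial({
  uniforms: {
    tSample:    { value: null }, // latest one-sample-per-pixel image
    tPrevious:  { value: null }, // average of all samples so far
    frameCount: { value: 0.0 },  // how many samples are already in tPrevious
  },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position.xy, 0.0, 1.0);
    }`,
  fragmentShader: `
    uniform sampler2D tSample;
    uniform sampler2D tPrevious;
    uniform float frameCount;
    varying vec2 vUv;
    void main() {
      vec3 prev = texture2D(tPrevious, vUv).rgb;
      vec3 cur  = texture2D(tSample,  vUv).rgb;
      // running average: (n * prev + cur) / (n + 1)
      gl_FragColor = vec4(mix(cur, prev, frameCount / (frameCount + 1.0)), 1.0);
    }`,
});

let frameCount = 0;
function renderOnePass() {
  // 1) Path-trace one sample per pixel into sampleTarget
  //    (pathTracingScene holds the full-screen quad with the path-tracing shader).
  renderer.setRenderTarget(sampleTarget);
  renderer.render(pathTracingScene, quadCamera);

  // 2) Blend it with the previous average, writing into the *other* accumulation target.
  accumMaterial.uniforms.tSample.value    = sampleTarget.texture;
  accumMaterial.uniforms.tPrevious.value  = accumRead.texture;
  accumMaterial.uniforms.frameCount.value = frameCount;
  renderer.setRenderTarget(accumWrite);
  renderer.render(accumScene, quadCamera); // quad using accumMaterial

  // 3) Show the new average on screen with a simple copy quad, then swap the targets.
  displayMaterial.uniforms.tImage.value = accumWrite.texture;
  renderer.setRenderTarget(null);
  renderer.render(displayScene, quadCamera);

  [accumRead, accumWrite] = [accumWrite, accumRead];
  frameCount++;
}
```

In other words, each pass would read `accumRead.texture` and `sampleTarget.texture` as ordinary texture uniforms while writing into `accumWrite`, and the two accumulation targets would swap roles every frame. My question is essentially whether feeding one WebGLRenderTarget's `.texture` into another pass like this is the intended way to do it in THREE.js, or whether there is a better mechanism for it.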