I am looking to blur a panoramic view so that I can place a user interface in front of the blurred view. To achieve this, I use a fragment-shader-based implementation that computes the blurred image on the client side.
Everything works perfectly with the regular renderer. However, when I switch to THREE.StereoEffect to render the scene, the blurred image does not show up on the screen.
For a demonstration, see the attached snippet (jsfiddle: https://jsfiddle.net/n988sg96/3/): toggling the blur button works as expected, but if you enable stereo mode and then apply the blur, the screen goes black (i.e. the blurred image is not rendered).
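For context, the render loop is essentially the following (a simplified sketch; `stereoEnabled`, `scene` and `camera` stand in for the fiddle's actual variables):

```javascript
// Simplified render loop: the same scene is drawn either with the plain
// renderer or with the StereoEffect wrapper, depending on a UI flag.
var renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);

// StereoEffect comes from examples/js/effects/StereoEffect.js
var stereoEffect = new THREE.StereoEffect(renderer);
stereoEffect.setSize(window.innerWidth, window.innerHeight);

var stereoEnabled = false; // toggled by the stereo button

function animate() {
  requestAnimationFrame(animate);
  if (stereoEnabled) {
    stereoEffect.render(scene, camera); // blurred texture stays black here
  } else {
    renderer.render(scene, camera);     // blurred texture shows up fine here
  }
}
animate();
```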
The blurred image is created by the createBlurredTexture() function, which uses the same renderer as the scene and two render targets, one for the horizontal blur pass and one for the vertical blur pass.
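Roughly, it does the following (a simplified sketch rather than the exact fiddle code; it assumes the HorizontalBlurShader / VerticalBlurShader from the three.js examples and the older renderer.render(scene, camera, target) call signature, so the details may differ depending on the three.js version):

```javascript
// Two render targets: one receives the scene / vertical pass,
// the other the horizontal pass.
var rtA = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);
var rtB = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);

// Full-screen quad used for the blur passes.
var blurCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);
var blurScene  = new THREE.Scene();
var quad = new THREE.Mesh(new THREE.PlaneBufferGeometry(2, 2));
blurScene.add(quad);

var hBlur = new THREE.ShaderMaterial({
  uniforms: THREE.UniformsUtils.clone(THREE.HorizontalBlurShader.uniforms),
  vertexShader: THREE.HorizontalBlurShader.vertexShader,
  fragmentShader: THREE.HorizontalBlurShader.fragmentShader
});
var vBlur = new THREE.ShaderMaterial({
  uniforms: THREE.UniformsUtils.clone(THREE.VerticalBlurShader.uniforms),
  vertexShader: THREE.VerticalBlurShader.vertexShader,
  fragmentShader: THREE.VerticalBlurShader.fragmentShader
});

function createBlurredTexture() {
  // 1. Render the panorama into the first target with the normal camera.
  renderer.render(scene, camera, rtA, true);

  // 2. Horizontal pass: read rtA, write into rtB.
  quad.material = hBlur;
  hBlur.uniforms.tDiffuse.value = rtA.texture; // the target itself on very old releases
  hBlur.uniforms.h.value = 1 / window.innerWidth;
  renderer.render(blurScene, blurCamera, rtB, true);

  // 3. Vertical pass: read rtB, write back into rtA.
  quad.material = vBlur;
  vBlur.uniforms.tDiffuse.value = rtB.texture;
  vBlur.uniforms.v.value = 1 / window.innerHeight;
  renderer.render(blurScene, blurCamera, rtA, true);

  // rtA now holds the blurred panorama; its texture is applied to the UI backdrop.
  return rtA.texture;
}
```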
I have verified that both render targets contain the correct image by exporting their contents with renderer.readRenderTargetPixels(), and this holds regardless of whether stereo mode is active.
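The check itself is just a pixel read-back along these lines (the helper name is illustrative, not from the fiddle):

```javascript
// Read a render target back into a typed array so it can be drawn onto a
// 2D canvas and exported as an image for inspection.
function dumpRenderTarget(renderTarget, width, height) {
  var pixels = new Uint8Array(width * height * 4);
  renderer.readRenderTargetPixels(renderTarget, 0, 0, width, height, pixels);
  return pixels; // contains the expected blurred image in both mono and stereo mode
}
```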
My questions are:
- Why is the texture from the RenderTarget not rendering with the StereoEffect?
- Are there alternative approaches that would achieve the same result?
// JavaScript code snippet will go here
/* CSS code snippet will go here */
<!-- External script imports will go here -->