In three.js, you can use classes such as OculusRiftEffect, VREffect, or VRRenderer to render a scene on an Oculus Rift.
Another useful three.js class is EffectComposer, which composes multiple scenes (via a chain of render passes) into the output of a single renderer.
My question: how can the output of EffectComposer be displayed on the Oculus Rift?
The problem is that the Rift classes above are initialized with a renderer such as WebGLRenderer, and are then called in the render loop to display the scene on the Rift:
this.vrrenderer.render(this.threeScene, this.camera);
Likewise, EffectComposer is initialized with a renderer such as WebGLRenderer, and is called in the render loop to display the composed scene through that renderer:
this.composer.render();
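To make the conflict concrete, here is a minimal sketch of the two calling patterns. The classes below are hypothetical stand-ins (the real THREE.WebGLRenderer, THREE.VREffect, and THREE.EffectComposer need a browser WebGL context), but the wiring mirrors how the real ones are driven: both wrappers take the renderer at construction and both expect to issue the final draw themselves each frame.

```javascript
// Hypothetical stand-ins illustrating the call pattern only.
const calls = [];

class RendererStub {
  render(scene, camera) { calls.push('renderer.render'); }
}

// The Rift classes wrap a renderer and draw the scene once per eye.
class VREffectStub {
  constructor(renderer) { this.renderer = renderer; }
  render(scene, camera) {
    calls.push('effect.render');
    this.renderer.render(scene, camera); // left eye
    this.renderer.render(scene, camera); // right eye
  }
}

// EffectComposer also wraps a renderer and runs its passes through it.
class ComposerStub {
  constructor(renderer) { this.renderer = renderer; this.passes = []; }
  addPass(pass) { this.passes.push(pass); }
  render() {
    calls.push('composer.render');
    for (const pass of this.passes) this.renderer.render(pass, null);
  }
}

const renderer = new RendererStub();
const effect = new VREffectStub(renderer);
const composer = new ComposerStub(renderer);
composer.addPass('renderPass');

// Each frame I can call one or the other, but not both meaningfully:
effect.render('scene', 'camera'); // stereo output, but no post-processing
composer.render();                // post-processing, but no stereo output

console.log(calls.join(', '));
```

Both wrappers sit in the same position in the pipeline (directly in front of the WebGLRenderer), which is exactly why they can't simply be chained.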
The obstacle is that EffectComposer cannot be initialized with an OculusRiftEffect, VREffect, or VRRenderer in place of the expected WebGLRenderer.
So: how does one bridge the gap between EffectComposer and the Rift classes, so the composed output ends up on the Rift?
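One direction I have considered (unverified, sketched here with hypothetical stand-in stubs since the real pipeline needs WebGL): have the composer render into an off-screen render target, then have the Rift effect render a full-screen quad textured with that target, once per eye. The rough data flow would be:

```javascript
// Hypothetical stubs sketching a bridge via an off-screen render target:
// the composer draws into the target, then the Rift effect presents a
// screen-filling quad sampling that target, once per eye.
class RenderTargetStub {
  constructor() { this.texture = null; }
}

class ComposerStub {
  constructor(renderTarget) { this.renderTarget = renderTarget; }
  render() {
    // In the real library this would run all passes into the target.
    this.renderTarget.texture = 'composed frame';
  }
}

class RiftEffectStub {
  // Draws the given texture once per eye (barrel distortion omitted).
  render(texture) {
    return ['left eye: ' + texture, 'right eye: ' + texture];
  }
}

const target = new RenderTargetStub();
const composer = new ComposerStub(target);
const effect = new RiftEffectStub();

// Per frame: post-process off-screen, then present in stereo.
composer.render();
const eyes = effect.render(target.texture);
console.log(eyes.join(' | '));
```

I don't know whether this is the intended way to combine the two, or whether there is a supported hook in EffectComposer or the Rift classes for it.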
Your insights are greatly appreciated!