I am currently developing an application for the Oculus Rift using JavaScript, three.js, and OculusRiftEffect.js.
In order to create a transparent portion of a 2D ring for the menu, I am attempting to generate a MeshBasicMaterial.alphaMap texture by rendering into a WebGLRenderTarget. However, this appears to conflict with OculusRiftEffect.js, which also uses a WebGLRenderTarget internally. The specific issue arises in the code below.
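For context, the renderer and the effect are created roughly like this (a simplified sketch of my setup; the exact OculusRiftEffect options are omitted):
renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
oculusRiftEffect = new THREE.OculusRiftEffect(renderer); // options such as worldScale left out here
oculusRiftEffect.setSize(window.innerWidth, window.innerHeight);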
The setup for rendering the texture:
menuAlphaScene = new THREE.Scene();
menuAlphaCamera = new THREE.PerspectiveCamera(75, 1, 0.1, 1000); // perhaps an orthographic camera would be more suitable
menuAlphaCamera.position.z = 1;
menuAlphaRT = new THREE.WebGLRenderTarget(512, 512, { minFilter: THREE.LinearFilter, magFilter: THREE.NearestFilter, format: THREE.RGBFormat });
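As an aside, if an orthographic camera does turn out to be the better fit for this flat ring, I would expect the setup to look roughly like this (the frustum bounds are just a guess):
menuAlphaCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0.1, 10); // left, right, top, bottom, near, far
menuAlphaCamera.position.z = 1;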
Assigning the texture to the object (note that I am currently using map for debugging purposes instead of alphaMap):
menuObject = new THREE.Mesh(new THREE.RingGeometry(0.7, 1, 30, 30), new THREE.MeshBasicMaterial({
    side: THREE.DoubleSide,
    map: menuAlphaRT
}));
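For reference, the version I am ultimately aiming for would look roughly like this (untested; I am assuming transparent: true is needed for the alphaMap to actually blend):
menuObject = new THREE.Mesh(new THREE.RingGeometry(0.7, 1, 30, 30), new THREE.MeshBasicMaterial({
    color: 0xffffff,
    side: THREE.DoubleSide,
    transparent: true,     // assumption: needed so the alpha values take effect
    alphaMap: menuAlphaRT  // alphaMap samples the green channel of the render target
}));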
Rendering the texture:
function regenMenuAlphaTexture(sixth) {
    var rtScene = new THREE.Scene();
    if (sixth >= 0) {
        // Highlight one sixth of the ring; RingGeometry expects a thetaStart and a thetaLength
        var angle = 2 * Math.PI / 6 * sixth;
        var obj = new THREE.Mesh(new THREE.RingGeometry(0.5, 1, 30, 30, angle, 2 * Math.PI / 6), new THREE.MeshBasicMaterial({
            color: 0xaaaaaa,
            side: THREE.DoubleSide
        }));
        rtScene.add(obj);
    }
    renderer.setClearColor(0xff0000, 1); // red clear color so the texture is easy to spot while debugging
    renderer.render(rtScene, menuAlphaCamera, menuAlphaRT); // render into the render target, not the canvas
}
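(As an aside, I know the scene and geometry are rebuilt on every call; a cached variant using the menuAlphaScene declared above would presumably look something like this:)
// Sketch: build the six ring segments once and only toggle visibility per frame
var menuSegments = [];
for (var i = 0; i < 6; i++) {
    var seg = new THREE.Mesh(
        new THREE.RingGeometry(0.5, 1, 30, 30, 2 * Math.PI / 6 * i, 2 * Math.PI / 6),
        new THREE.MeshBasicMaterial({ color: 0xaaaaaa, side: THREE.DoubleSide }));
    seg.visible = false;
    menuAlphaScene.add(seg);
    menuSegments.push(seg);
}
regenMenuAlphaTexture would then only set menuSegments[sixth].visible = true before rendering menuAlphaScene into menuAlphaRT.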
The code within the requestAnimationFrame block:
function render() {
    update();
    renderer.clear();
    regenMenuAlphaTexture(0);               // draw into menuAlphaRT before the main render
    oculusRiftEffect.render(scene, camera);
    requestAnimationFrame(render);
}
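The loop is started once after setup, along these lines (the init name is just illustrative, not my exact code):
init();   // creates the renderer, scenes, cameras and menuAlphaRT
render(); // kicks off the requestAnimationFrame loop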
When using oculusRiftEffect.render, the texture displays as black; however, when using plain renderer.render, the ring renders correctly.
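To be concrete, the only difference between the two cases is this one line in render(), with everything else unchanged:
// works: the ring shows the generated texture
// renderer.render(scene, camera);
// shows the ring texture as solid black:
oculusRiftEffect.render(scene, camera);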
No WebGL errors are reported (at least none that I can see in the console), and I am unsure what is causing this.