My goal is to project a 3D scene onto a 2D plane using ray tracing. My ultimate aim is volume rendering, but for now I am struggling with the basics. I have set up a three.js scene with a viewing plane attached to the camera, positioned just in front of it.
The Setup:
In the shader, a ray is cast from the camera through each of the 250x250 points on the viewing plane. Behind the plane lies a 41x41x41 cube volume. If a ray intersects the cube, the corresponding point on the viewing plane turns red; otherwise, it stays black. However, this only works when the cube is viewed from the front. An example can be viewed here:
If you rotate the camera to view the cube from a different angle, you will notice that instead of a proper projection of the cube, you get a square with a few stray pixels along its edges.
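For reference, one standard way to test a ray against an axis-aligned box is the slab method. The following GLSL sketch is illustrative rather than copied from my full code below, and it assumes a world-space ray origin and direction:

bool intersectsBox(vec3 rayOrigin, vec3 rayDir, vec3 boxMin, vec3 boxMax) {
    // Entry and exit distances along each axis (the slabs)
    vec3 invDir = 1.0 / rayDir;
    vec3 t0 = (boxMin - rayOrigin) * invDir;
    vec3 t1 = (boxMax - rayOrigin) * invDir;
    vec3 tMin = min(t0, t1);
    vec3 tMax = max(t0, t1);
    // The latest entry must come before the earliest exit,
    // and the hit must lie in front of the camera
    float tNear = max(max(tMin.x, tMin.y), tMin.z);
    float tFar = min(min(tMax.x, tMax.y), tMax.z);
    return tFar >= max(tNear, 0.0);
}

For the projection to change with the view, rayDir has to be recomputed as the camera rotates; a direction computed in a fixed frame produces the same head-on image from every angle.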
Raytracing Code:
Vertex Shader:
varying float PointIntensity;

//Code for inside() function and getDensity() function (see the full code link below)

void main() {
    // Pass the sampled density to the fragment shader
    PointIntensity = getDensity(position);
    vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);
    gl_Position = projectionMatrix * mvPosition;
}
Fragment Shader:
varying float PointIntensity;

void main() {
    //Red points for rays traversing the cube, black for empty space
    gl_FragColor = vec4(PointIntensity, 0.0, 0.0, 1.0);
}
Full Code: http://pastebin.com/4YmWL0u1
Running Code:
I would appreciate any tips on where I might have gone wrong with this approach.
EDIT:
I have made the changes Mark Lundin suggested: the UV coordinates on the viewplane are now unprojected using the inverse of the projection matrix, which is passed to the shader as a uniform. However, when I move the camera I still see only a red square (though without the weird pixels). A sketch of the unprojection step is included after the link below.
New code here: http://pastebin.com/Dxh5C9XX
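The unprojection step looks roughly like the following GLSL sketch. The uniform and function names (uInvProjectionMatrix, uCameraMatrixWorld, rayDirection) are illustrative rather than copied from the pastebin, and it assumes the camera's world matrix is also uploaded so the ray can follow the camera's rotation:

uniform mat4 uInvProjectionMatrix; // inverse of camera.projectionMatrix
uniform mat4 uCameraMatrixWorld;   // camera.matrixWorld (view space -> world space)
varying vec2 vUv;                  // interpolated plane coordinate in [0, 1]

vec3 rayDirection() {
    // Map the UV coordinate to normalized device coordinates in [-1, 1]
    vec2 ndc = vUv * 2.0 - 1.0;
    // Unproject to view space, placing the point on the near plane
    vec4 viewPos = uInvProjectionMatrix * vec4(ndc, -1.0, 1.0);
    viewPos /= viewPos.w;
    // Rotate into world space; w = 0.0 drops the camera translation
    vec3 dir = (uCameraMatrixWorld * vec4(viewPos.xyz, 0.0)).xyz;
    return normalize(dir);
}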
Updated example running here:
Pressing x, y, or z prints the current camera coordinates, so you can confirm that the camera really is moving.