Our project is a VR/stereoscopic web application built with three.js that reports the camera's viewing angles.
For instance, we have set up a "room" scene with the camera positioned at its center. The camera's rotation is driven by OrbitControls.
// Create a camera and place it in the scene
var camera = new THREE.PerspectiveCamera(90, 1, 0.1, 2000);
camera.position.set(0, 35, 60);

// Attach OrbitControls
var controls = new THREE.OrbitControls(camera, element);
controls.target.set(camera.position.x + 0.1, camera.position.y, camera.position.z);
controls.enableZoom = true;
controls.enablePan = true;

// Attach DeviceOrientationControls
// (note: this reuses the same variable, so it replaces the OrbitControls reference)
controls = new THREE.DeviceOrientationControls(camera, true);
controls.connect();
controls.update();
During each animationFrame call, we examine the camera's world-direction vector:

// Recent three.js versions require passing a target vector to getWorldDirection
var vector = camera.getWorldDirection(new THREE.Vector3());
var theta = Math.atan2(vector.x, vector.z);
var phi = Math.sqrt((vector.x * vector.x) + (vector.z * vector.z)); // horizontal magnitude of the direction
var pitch = Math.atan2(phi, vector.y);
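As a sanity check outside the render loop, the same formulas can be evaluated against a known direction vector (a plain-JS sketch using an object literal in place of a THREE.Vector3; no three.js needed):

```javascript
// A default three.js camera faces down -Z; use that as the test direction.
const vector = { x: 0, y: 0, z: -1 };

const theta = Math.atan2(vector.x, vector.z);                     // atan2(0, -1) = PI
const phi = Math.sqrt(vector.x * vector.x + vector.z * vector.z); // 1
const pitch = Math.atan2(phi, vector.y);                          // atan2(1, 0) = PI / 2
```

Note that theta comes out as π rather than 0 for the default -Z facing, because Math.atan2(x, z) measures the angle from the +Z axis.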
We anticipate the following measurements in radians:
- Theta represents the deviation from the original view direction (Z), similar to looking down from the (Y) axis (imagine a clock on the ground beneath the camera).
- Phi indicates the inclination from the initial view direction (Z) upwards or downwards (think of a clock on the side of the camera).
- Pitch denotes the rotation around the Z-axis when observing down the original Z vector (visualize a clock on the back of the camera).
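The first bullet can be checked numerically: yawing the direction vector a quarter turn about Y should move theta by π/2 while leaving pitch unchanged (a plain-JS sketch; the two vectors are hypothetical and chosen so no atan2 wraparound occurs):

```javascript
// Facing +Z, then the same direction yawed 90 degrees about +Y (now facing +X).
const before = { x: 0, y: 0, z: 1 };
const after = { x: 1, y: 0, z: 0 };

const theta = (v) => Math.atan2(v.x, v.z);
const pitch = (v) => Math.atan2(Math.sqrt(v.x * v.x + v.z * v.z), v.y);

theta(after) - theta(before);   // PI / 2: yaw moved by a quarter turn
pitch(after) === pitch(before); // true: pitch is unaffected by yaw
```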
However, there appears to be a consistent offset in Theta, while Phi and Pitch look correct. We convert radians to degrees with a formula such as: pitch * (180 / Math.PI)
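The conversion can be wrapped in a small helper (radToDeg is our own name here; in recent three.js versions THREE.MathUtils.radToDeg does the same thing):

```javascript
// Hypothetical helper, equivalent to THREE.MathUtils.radToDeg.
const radToDeg = (rad) => (rad * 180) / Math.PI;

radToDeg(Math.PI);     // a half turn in degrees
radToDeg(Math.PI / 2); // a quarter turn in degrees
```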
The environment and camera update and rotate effectively, and moving the phone/VR glasses left or right results in a natural "look around the room" effect.
Each measurement is taken by rotating about one axis at a time while keeping the other two axes as close to their original orientation as possible.
What could we possibly overlook in this process?