I am trying to perform matrix multiplication in Three.js. In my code, I have an Object3D and I managed to retrieve the correct matrix by using console.log like so:
console.log(scene.getObjectByName("Pointer").matrix)
The output looks something like this:
THREE.Matrix4 {elements: Float32Array[16]}
  elements: Float32Array[16] 0: 1, 1: 0, 2: 0, 3: 0, 4: 0, 5: 1, 6: 0, 7: 0, 8: 0, 9: 0, 10: 1, 11: 0, 12: -150, 13: 0, 14: 0, 15: 1
Note the element at index 12, which has a value of -150 (the x-translation written by obj.translateX(-150); Matrix4 stores its elements in column-major order).
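One thing worth ruling out first: browser consoles display logged objects lazily, so an expandable console entry can show values that were written *after* the console.log call. A minimal sketch in plain JavaScript (no three.js needed, names are illustrative) of why a snapshot is more trustworthy than logging the live object:

```javascript
// A live 16-element array standing in for matrix.elements (identity).
const elements = [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1];

// Snapshot the values at "log time" so later mutations cannot affect them.
const snapshot = elements.slice();

// Simulate the render loop updating the matrix afterwards: a translateX(-150)
// ends up as the x-translation at index 12 of a column-major Matrix4.
elements[12] = -150;

console.log(snapshot[12]); // 0    -- what the matrix held when we "logged" it
console.log(elements[12]); // -150 -- what the live object holds now
```

Logging `matrix.elements.slice()` (or `JSON.stringify(matrix.elements)`) instead of the matrix object itself shows the values as they were at that exact moment.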
var newMat = new THREE.Matrix4();
console.log(scene.getObjectByName("Pointer").matrix.elements)
// output: [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1]
newMat = newMat.copy(scene.getObjectByName("Pointer").matrix);
console.log(newMat);
// output: elements: Float32Array[16] 1,0,0,0,0,1,0,0,0,0,1,0,0,0,0,1
This results in an identity matrix being returned (the element at index 12 is 0).
Can anyone spot what might be wrong here?
UPDATE: It seems that inside the render loop, newMat.copy(...) works just fine!
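That update points at the likely cause: an Object3D's .matrix is only recomposed from its position/rotation/scale when updateMatrix() runs, and the renderer does that each frame (matrixAutoUpdate defaults to true), which is why the copy works inside the render loop. Outside the loop, before the first render, .matrix is still the identity. A sketch of this behavior using a tiny stand-in object rather than the real three.js classes (makeObject3D and its methods are illustrative, and updateMatrix() here only handles the x-translation):

```javascript
// Minimal stand-in for the relevant part of THREE.Object3D.
function makeObject3D() {
  return {
    position: { x: 0, y: 0, z: 0 },
    // Column-major 4x4, starts as identity -- like a fresh THREE.Matrix4.
    matrix: [1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
    translateX(d) { this.position.x += d; },          // does NOT touch .matrix
    updateMatrix() { this.matrix[12] = this.position.x; /* y, z, rotation, scale omitted */ },
  };
}

const obj = makeObject3D();
obj.translateX(-150);

console.log(obj.matrix[12]); // 0    -- matrix not recomposed yet
obj.updateMatrix();          // what the renderer does for you each frame
console.log(obj.matrix[12]); // -150 -- now safe to copy
```

So if the copy has to happen outside the render loop, calling scene.getObjectByName("Pointer").updateMatrix() immediately before newMat.copy(...) should produce the expected -150 at index 12.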