My friend and I are building an AR web app for a university workshop. Our goal is to track hand movements and overlay a 3D object on the wrist. However, we are facing two primary challenges:
- How can we utilize the webcam in three.js as part of the scene?
- How can we accurately detect the wrist landmark and place an object on it?
We appreciate any suggestions or guidance you may have.
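For the first point, the approach we are considering is `THREE.VideoTexture`, which can be assigned as the scene background so the webcam feed renders behind the 3D content. A minimal sketch (it assumes a `<video id="webcam">` element that already receives the `getUserMedia` stream, as in our hand-detection setup):

```javascript
import * as THREE from 'https://cdn.skypack.dev/three@0.132.2';

// The <video> element that getUserMedia already streams into (assumed id).
const video = document.getElementById('webcam');

// VideoTexture re-uploads the current video frame on every render.
const videoTexture = new THREE.VideoTexture(video);

const scene = new THREE.Scene();
scene.background = videoTexture; // webcam feed fills the scene background

const camera = new THREE.PerspectiveCamera(
  50, window.innerWidth / window.innerHeight, 0.1, 100
);
camera.position.z = 2;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

function animate() {
  requestAnimationFrame(animate);
  renderer.render(scene, camera); // the texture updates automatically
}
animate();
```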
So far, we have successfully implemented hand detection using the webcam video stream as input. However, when we try to integrate this with a Three.js scene and camera, the two components seem to operate independently of each other.
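For the second point, we suspect the coordinate handoff between the two libraries is the stumbling block: MediaPipe returns landmarks normalized to [0, 1] with the origin at the top-left of the video, while Three.js normalized device coordinates run from -1 to 1 with y pointing up. A sketch of that conversion (the function name and the mirroring flag are our own, not part of either API):

```javascript
// Hypothetical helper: convert a MediaPipe normalized landmark (x, y in
// [0, 1], origin top-left, y growing downward) into three.js normalized
// device coordinates (x, y in [-1, 1], origin at the centre, y growing up).
function landmarkToNDC(landmark, mirrored = true) {
  // Mirror x for a selfie-view webcam so the overlay matches what the user sees.
  const x = (mirrored ? 1 - landmark.x : landmark.x) * 2 - 1;
  // Flip y: MediaPipe's y axis points down, three.js's points up.
  const y = -(landmark.y * 2 - 1);
  return { x, y };
}
```

The resulting NDC pair could then be unprojected into world space, e.g. `new THREE.Vector3(ndc.x, ndc.y, 0.5).unproject(camera)`, to place the object at the wrist.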
UPDATE: While our existing code functions as expected, there seems to be an issue specifically related to Three.js:
import {
HandLandmarker,
FilesetResolver
} from "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.0";
import * as THREE from 'https://cdn.skypack.dev/three@0.132.2';
// Rest of the code has been left unchanged...
The error message currently displayed in the console is as follows:
three.js:2744 Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'center')
at Sphere.copy (three.js:2744:29)
at Frustum.intersectsObject (three.js:7224:15)
at projectObject (three.js:15269:47)
at projectObject (three.js:15292:7)
at WebGLRenderer.render (three.js:15181:5)
at render (script.js:163:14)
at initializeThreeJs (script.js:167:3)
at HTMLButtonElement.toggleWebcam (script.js:46:5)
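From what we can tell, this error in `Frustum.intersectsObject` fires when the renderer culls an object whose geometry has no bounding sphere, which can happen if a mesh reaches the scene before its geometry exists (for example, a model that loads asynchronously). A guarded version of what we think the fix looks like; the loader, file name, and structure here are placeholders, not our actual code:

```javascript
import * as THREE from 'https://cdn.skypack.dev/three@0.132.2';
import { GLTFLoader } from 'https://cdn.skypack.dev/three@0.132.2/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  50, window.innerWidth / window.innerHeight, 0.1, 100
);
const renderer = new THREE.WebGLRenderer();

let wristModel = null; // stays null until the model has finished loading

new GLTFLoader().load('watch.glb', (gltf) => {
  wristModel = gltf.scene;  // add gltf.scene, not the gltf wrapper itself
  scene.add(wristModel);    // only add once the geometry actually exists
});

function render() {
  if (wristModel) {
    // update wristModel.position from the detected wrist landmark here
  }
  renderer.render(scene, camera);
  requestAnimationFrame(render);
}
render();
```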