Using the Pythagorean theorem, I want to calculate the distances between the points in an array, iterating through them so that I can identify the n closest points. However, I'm struggling to work out how to compute the distance (d) between each point and the points that follow it for comparison. Here is the initial array of points:
var points = [
  { id: 1, x: 0.0, y: 0.0 },
  { id: 2, x: 10.1, y: -10.1 },
  { id: 3, x: -12.2, y: 12.2 },
  { id: 4, x: 38.3, y: 38.3 },
  { id: 5, x: 79.0, y: 179.0 },
];
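For reference, my understanding is that the distance between two of these points should come from their x and y properties, something like the helper sketched below (dist is just a hypothetical name, not something I'm using yet):

// Hypothetical helper: Pythagorean (Euclidean) distance between two point objects
function dist(a, b) {
  var dx = a.x - b.x;
  var dy = a.y - b.y;
  return Math.sqrt(dx * dx + dy * dy);
}
// e.g. dist(points[0], points[1]) is roughly 14.28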
To proceed, I need to iterate through the points and create a new array that contains the distance between each point and all other points, using the Pythagorean theorem:
points.forEach((item) => {
  var newArray = [item];
  var pt = null;
  var d = null;
  for (var i = 0; i < points.length; i = i + 1) {
    // compare this point with all other points
    for (var j = i + 1; j < points.length; j = j + 1) {
      // calculate distance
      var curr = Math.sqrt(Math.pow(points[i][0] - points[j][0], 2) + Math.pow(points[i][1] - points[j][1], 2));
      // record the distance between each pair of points in a new array
      if (d === null || curr < d) {
        o = points.id[i];
        pt = points.id[j];
        d = curr;
      }
    }
  }
  newArray.push = {
    "id": o,
    "pt": pt,
    "d": d
  };
  console.log(newArray);
});
I've been encountering errors like "Cannot read property '0' of undefined", which suggests there is a flaw in my logic. Any advice on what might be going wrong here?
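For what it's worth, here is roughly the shape of the result I think I'm after, written as a sketch only, assuming each point should end up paired with the id of its nearest neighbour and the distance to it:

var results = points.map(function (p) {
  var nearestId = null;
  var d = null;
  points.forEach(function (other) {
    if (other.id === p.id) return; // don't compare a point with itself
    // Pythagorean distance using the x/y properties rather than array indices
    var curr = Math.sqrt(Math.pow(p.x - other.x, 2) + Math.pow(p.y - other.y, 2));
    if (d === null || curr < d) {
      nearestId = other.id;
      d = curr;
    }
  });
  return { id: p.id, pt: nearestId, d: d };
});
console.log(results);

Is something along these lines closer to what the loop should look like?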