I am working with an array of objects containing latitude and longitude points:
var pts = [
  { "X": 52.67528921580262, "Y": 8.373513221740723 },
  { "X": 52.6759657545252, "Y": 8.374114036560059 },
  { "X": 52.682574466310314, "Y": 8.37256908416748 },
  { "X": 52.68356308524067, "Y": 8.373942375183105 },
  { "X": 52.68293869694087, "Y": 8.375487327575684 },
  { "X": 52.67685044320001, "Y": 8.376259803771973 },
  { "X": 52.6756535071859, "Y": 8.379607200622559 },
  { "X": 52.676017795531436, "Y": 8.382096290588379 },
  { "X": 52.68101344348877, "Y": 8.380722999572754 },
  { "X": 52.68351105322329, "Y": 8.383641242980957 },
  { "X": 52.68, "Y": 8.389 }
];
Could someone help me determine the minimum and maximum values for X and Y in this array?
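One minimal sketch of an approach (assuming the array always has the `X`/`Y` shape shown above and is non-empty): pull each coordinate out with `map`, then apply `Math.min`/`Math.max` across the resulting arrays.

```javascript
var pts = [
  { "X": 52.67528921580262, "Y": 8.373513221740723 },
  { "X": 52.6759657545252, "Y": 8.374114036560059 },
  { "X": 52.682574466310314, "Y": 8.37256908416748 },
  { "X": 52.68356308524067, "Y": 8.373942375183105 },
  { "X": 52.68293869694087, "Y": 8.375487327575684 },
  { "X": 52.67685044320001, "Y": 8.376259803771973 },
  { "X": 52.6756535071859, "Y": 8.379607200622559 },
  { "X": 52.676017795531436, "Y": 8.382096290588379 },
  { "X": 52.68101344348877, "Y": 8.380722999572754 },
  { "X": 52.68351105322329, "Y": 8.383641242980957 },
  { "X": 52.68, "Y": 8.389 }
];

// Collect each coordinate into its own array.
var xs = pts.map(function (p) { return p.X; });
var ys = pts.map(function (p) { return p.Y; });

// Spread the arrays into Math.min / Math.max via apply
// (works in older environments without the ES6 spread operator).
var bounds = {
  minX: Math.min.apply(null, xs),
  maxX: Math.max.apply(null, xs),
  minY: Math.min.apply(null, ys),
  maxY: Math.max.apply(null, ys)
};

console.log(bounds);
```

With ES6 this shortens to `Math.min(...xs)` etc.; for very large arrays a single `reduce` pass avoids the argument-length limits of `apply`/spread.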