I need to determine the color for my legend by comparing two values, without using the ceil, floor, or round functions. The color comes from a color table in which each row holds a stop point followed by its RGB values.
The backend provides points such as 0.4607441262895224 and 0.5500956769649571. I have to compare each point against the first value of every row in the color table to assign the correct color, but the match only works to two decimal places of precision, which is causing issues (see the sketch after the table).
"colors": [
[ 0.00, 255, 13, 186 ],
[ 0.25, 254, 4, 135 ],
[ 0.50, 73, 255, 35 ],
[ 0.75, 185, 116, 255 ],
[ 1.00, 32, 50, 255 ]
]
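A minimal sketch of one way to do the lookup, assuming the legend is a step scale where each point takes the color of the highest stop that does not exceed it (so 0.388920938 falls in the 0.25 bucket). The helper name colorForPoint is my own, and it uses only comparisons, no ceil, floor, or round:

// Minimal sketch: pick the highest stop whose threshold is <= the point.
// Assumes the table is sorted ascending by its first value.
const colors = [
  [0.00, 255, 13, 186],
  [0.25, 254, 4, 135],
  [0.50, 73, 255, 35],
  [0.75, 185, 116, 255],
  [1.00, 32, 50, 255],
];

function colorForPoint(point, table) {
  let match = table[0]; // fall back to the lowest stop
  for (const entry of table) {
    if (entry[0] <= point) {
      match = entry; // keep the latest stop not exceeding the point
    } else {
      break; // table is sorted, so later stops are all larger
    }
  }
  return match;
}

console.log(colorForPoint(0.388920938, colors)); // [0.25, 254, 4, 135]

For comparison, here is my current logic and why it fails: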
// main logic
// arraySet is the "colors" table shown above
const point = 0.388920938; // dynamic point value from the backend
const test = arraySet.find((ele) => {
  // ele[0] holds the stop values 0.00, 0.25, 0.50, etc.
  // toFixed(2) turns 0.388920938 into "0.39", which equals no stop,
  // so find() returns undefined; this comparison needs modification
  return point.toFixed(2) === ele[0].toFixed(2);
});
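If the intended rule is instead "nearest stop" rather than "highest stop at or below" (an assumption on my part, so 0.388920938 would map to 0.50), a comparison-only alternative is to minimize the absolute distance to each stop. The helper name nearestStop is my own, and Math.abs involves no ceiling, flooring, or rounding:

function nearestStop(point, table) {
  let best = table[0];
  let bestDistance = Math.abs(point - best[0]);
  for (const entry of table) {
    const distance = Math.abs(point - entry[0]);
    if (distance < bestDistance) {
      best = entry; // this stop is closer to the point than any seen so far
      bestDistance = distance;
    }
  }
  return best;
}

console.log(nearestStop(0.388920938, colors)); // [0.50, 73, 255, 35]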