Imagine this scenario:
var x = 2.175;
console.log(x.toFixed(2)); // 2.17
It may seem surprising, but it's actually expected behavior. The number literal 2.175
is stored in memory as a double slightly smaller than 2.175, because that is the nearest representable IEEE-754 value. This can be confirmed by running:
console.log(x.toFixed(20)); // 2.17499999999999982236
This is the case for the latest versions of Firefox, Chrome, and Opera on 32-bit Windows systems. However, that's not the main focus here.
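As an aside, the stored value can also be inspected without toFixed. A quick sketch (toPrecision simply prints more significant digits of the same double, and rounding by scaling agrees with it):
var x = 2.175;
console.log(x.toPrecision(21));         // 2.17499999999999982236
console.log(Math.round(x * 100) / 100); // 2.17, consistent with a stored value below 2.175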
The real curiosity lies in how Internet Explorer, of all browsers, handles this situation more conventionally:
var x = 2.175;
console.log(x.toFixed(2)); // 2.18
console.log(x.toFixed(20)); // 2.17500000000000000000
Intriguingly, all tested versions of Internet Explorer (IE8 through IE11), as well as MS Edge, exhibit this behavior. Quite perplexing, isn't it?
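For what it's worth, if the goal is to get the "decimal-looking" rounding that IE produces here, one possible workaround is to round the number's shortest decimal string (what String(x) returns, e.g. "2.175") instead of its full binary expansion. A minimal sketch, assuming non-negative inputs and no exponential notation (toFixedFromString is just an illustrative name):
function toFixedFromString(value, digits) {
  // String(value) is the shortest decimal string that round-trips to the
  // same double, e.g. "2.175" instead of "2.17499999999999982236...".
  var s = String(value);
  var dot = s.indexOf('.');
  if (dot === -1 || s.length - dot - 1 <= digits) {
    return value.toFixed(digits); // already short enough, just pad with zeros
  }
  // Shift the decimal point right by `digits` places, round half up on the
  // shortest string, then shift back.
  var shifted = Number(s.slice(0, dot) + s.slice(dot + 1, dot + 1 + digits) +
                       '.' + s.slice(dot + 1 + digits));
  return (Math.round(shifted) / Math.pow(10, digits)).toFixed(digits);
}
console.log(toFixedFromString(2.175, 2)); // 2.18
Math.round rounds halves up, which is what turns 2.175 into 2.18 here; this is only meant for display purposes.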
UPDATE: The plot thickens:
var x = 1.0; while ((x -= 0.1) > 0) console.log(x.toFixed(20));
IE                     Chrome
0.90000000000000000000 0.90000000000000002220
0.80000000000000000000 0.80000000000000004441
0.70000000000000010000 0.70000000000000006661
0.60000000000000010000 0.60000000000000008882
0.50000000000000010000 0.50000000000000011102
0.40000000000000013000 0.40000000000000013323
0.30000000000000015000 0.30000000000000015543
0.20000000000000015000 0.20000000000000014988
0.10000000000000014000 0.10000000000000014433
0.00000000000000013878 0.00000000000000013878
Why the discrepancies between IE and Chrome? And why is there no difference in the last line? A similar pattern emerges with x = 0.1; while (x -= 0.01) ...: until the value nears zero, toFixed in IE appears to take shortcuts.
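One way to separate the arithmetic from the formatting is to print each value with toPrecision next to toFixed, as in this quick sketch (toPrecision(21) just prints the same double with more significant digits):
var x = 1.0;
while ((x -= 0.1) > 0) {
  // If the toPrecision columns match across browsers while the toFixed
  // columns differ, the doubles are identical and only toFixed's
  // formatting differs between engines.
  console.log(x.toPrecision(21) + '  |  ' + x.toFixed(20));
}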
Disclaimer: I acknowledge that floating-point math has its limitations. What remains puzzling is why IE behaves differently from the other browsers.