While benchmarking JavaScript against .NET Core to choose a server-side framework for some RESTful services that need to iterate over very large arrays (roughly 2.1 billion elements), I ran into something odd. While experimenting with a simple loop, I noticed that Node.js behaves unusually after a certain number of iterations. To confirm it was not platform-specific, I ran the same test on several environments:
- macOS Catalina, Node.js v12.18, Intel Core i9 @ 4 GHz, 6 cores
- CentOS 7 Linux VM, Node.js v12.18, Intel Core i9 @ 4 GHz, 2 cores
- Google Chrome 84.0.4147.105 (Official Build) (64-bit)
- Mozilla Firefox 78.2
Sample code:
1. Node.js:
var cnt = 0;
var logPeriod = 100000000; // report a timing every 100 million iterations
var max = 10000000000;     // 10 billion iterations in total

for (let i = 0; i < max; i++) {
  if (i % logPeriod === 0) {
    if (i !== 0) {
      // close the timer for the chunk that just finished
      console.timeEnd(`${cnt * logPeriod}`);
      cnt++;
    }
    // start a timer for the next chunk of logPeriod iterations
    console.time(`${cnt * logPeriod}`);
  }
}
2. Browser:
<!DOCTYPE html>
<html>
<head>
  <script>
    function doloop() {
      var cnt = 0;
      var logPeriod = 100000000; // report a timing every 100 million iterations
      var max = 10000000000;     // 10 billion iterations in total

      for (let i = 0; i < max; i++) {
        if (i % logPeriod === 0) {
          if (i !== 0) {
            // close the timer for the chunk that just finished
            console.timeEnd(`${cnt * logPeriod}`);
            cnt++;
          }
          // start a timer for the next chunk of logPeriod iterations
          console.time(`${cnt * logPeriod}`);
        }
      }
    }
  </script>
</head>
<body>
  <button onclick="doloop()">doloop</button>
</body>
</html>
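For reference, here is the cross-check I would run next (a minimal sketch, not part of the tests above): the same loop, but each chunk of logPeriod iterations is timed with Date.now() and printed as a plain millisecond value, so the readings do not depend on console.time's label handling. It runs unchanged in Node.js and in the browser console.

// Cross-check sketch: time each chunk of logPeriod iterations with Date.now()
// and print the elapsed milliseconds directly.
var logPeriod = 100000000; // 100 million iterations per reported chunk
var max = 10000000000;     // 10 billion iterations in total
var start = Date.now();

for (let i = 0; i < max; i++) {
  if (i % logPeriod === 0 && i !== 0) {
    var now = Date.now();
    console.log(`${i - logPeriod}..${i}: ${now - start} ms`);
    start = now;
  }
}

If the timings still jump at the same iteration count with this version, the slowdown is in the loop itself rather than an artifact of console.time.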