While working on the well-known Ransom Note task from HackerRank, I started experimenting with the execution time of a JavaScript function. My focus shifted slightly from the original task as I explored how the function performed when iterating through arrays of varying lengths.
I recorded the time it took to iterate through pairs of arrays with the following lengths:
1. 1,000 elements
2. 10,000 elements
3. 100,000 elements
4. 200,000 elements
5. 400,000 elements
Naturally, I anticipated that the time taken would increase proportionally with the array's length and was curious to discern any underlying patterns.
To my surprise, the results revealed significant discrepancies in execution time despite running the same function on identical arrays under similar conditions. At times, the first run for a given size was several times faster than the runs that followed.
I documented the execution times (in milliseconds, since I used Date.now()) for each array length in an object, resulting in the following data:
veryVeryBigData: {
  '1000': [ 1, 0, 0, 0, 0, 1, 0, 0 ],
  '10000': [ 12, 12, 12, 12, 12, 12, 12, 12 ],
  '100000': [ 1464, 5498, 5637, 5591, 5389, 5524, 5481, 5440 ],
  '200000': [ 5858, 21847, 22704, 22214, 21638, 21845, 21798, 21926 ],
  '400000': [ 64027, 91809, 92233, 90515, 92953, 92394, 93374, 104708 ]
}
The data shows substantial variation in execution times for arrays of 100,000 elements or more, with the first run in each group standing out from the rest.
If anyone could shed light on why this occurs or provide insights into further research areas for better comprehension, it would be greatly appreciated.
Below is the code snippet I utilized:
const ten = ["one", "two", "three", "four", "five", "six", "seven", "eight", "nine", "ten"];
const thousand = Array(100).fill(ten).flat();
const tenThousand = Array(10).fill(thousand).flat();
const hundredThousand = Array(10).fill(tenThousand).flat();
const twoHundredThousand = Array(2).fill(hundredThousand).flat();
const fourHundredThousand = Array(4).fill(hundredThousand).flat();
const wordsArrays = [thousand, tenThousand, hundredThousand, twoHundredThousand, fourHundredThousand];
const veryVeryBigData = { 1000: [], 10000: [], 100000: [], 200000: [], 400000: [] };
const checkMagazine = (mag, note) => {
  const start = Date.now();
  let result = "Yes";
  const magazine = [...mag]; // copy, so splice doesn't mutate the shared array
  note.forEach((w) => {
    const index = magazine.indexOf(w);
    if (index !== -1) {
      magazine.splice(index, 1);
    } else {
      result = "No";
    }
  });
  const totalTime = Date.now() - start;
  veryVeryBigData[note.length].push(totalTime);
  console.log("Result: ", result);
};
const performChecks = (mags, ns) => {
  mags.forEach((magazine) => {
    ns.forEach((note) => {
      if (note.length === magazine.length) {
        checkMagazine(magazine, note);
      }
    });
  });
};
// For experimental purposes, compare two identical arrays to isolate iteration time.
// Perform 8 comparisons to observe the diverging results.
for (let i = 0; i < 8; i++) {
  performChecks(wordsArrays, wordsArrays);
}
console.log("veryVeryBigData: ", veryVeryBigData);
Your insight and feedback would be highly valued!