After coming across this function online, I decided to optimize it.
For instance, an input of [1, 2, 3] produces the output [[1], [2], [1, 2], [3], [1, 3], [2, 3], [1, 2, 3]].
Below is the code snippet:
const combinations = arr => {
  let parentValue = [];
  for (const parentThis of arr) {
    // Start a new batch with the single-element combination for this item...
    const value = [[parentThis]];
    // ...then extend every combination built so far with this item.
    for (const thiss of parentValue) {
      value.push(thiss.concat([parentThis]));
    }
    parentValue = parentValue.concat(value);
  }
  return parentValue;
};
(I apologize for the unconventional variable names; they come from running this as a MongoDB aggregation.)
Executing this operation 10,000 times on an array consisting of 10 elements takes approximately 23 seconds. How can I enhance its performance? I'm willing to make some compromises.
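For context, I am timing it with a loop along these lines (a rough sketch; my actual harness may differ slightly):

// Hypothetical timing loop approximating how the numbers above were measured.
const input = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10];
const start = Date.now();
for (let i = 0; i < 10_000; i++) {
  combinations(input);
}
console.log(`elapsed: ${Date.now() - start} ms`);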
One immediate observation is how sensitive the runtime is to input size: executed 10,000 times on an array with 9 elements, the time drops to about 9 seconds, which makes sense given that the number of combinations (2^n − 1 in total) roughly doubles with each extra element.
I suspect that rejecting certain outputs before they are created (such as single-element combinations and the all-elements combination, which aren't very useful to me) could lead to further improvements, but I'm struggling to implement this in code; the sketch below is roughly what I have in mind.
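This is an untested sketch under one assumption: the intermediate combinations still have to be built, since longer combinations extend them, so the best I can see is never emitting the unwanted sizes (the name combinationsFiltered is mine):

// Untested sketch: "building" keeps every combination so far (needed as a
// base for longer ones), while "output" only receives the sizes I care about.
const combinationsFiltered = arr => {
  let building = [];
  const output = [];
  for (const item of arr) {
    const added = [[item]];
    for (const combo of building) {
      const next = combo.concat([item]);
      added.push(next);
      // Skip the all-elements combination; singles never reach this branch.
      if (next.length < arr.length) output.push(next);
    }
    building = building.concat(added);
  }
  return output;
};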
I also attempted to improve performance by eliminating the first iteration (using [[arr[0]]] as the initial parentValue and starting the outer loop from the second element), but it didn't make a noticeable difference.
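Reconstructed from memory, that attempt looked roughly like this (the name combinationsSeeded is mine, and the details may differ from what I actually ran):

// Reconstruction of the attempt: seed with the first element's combination
// and start the outer loop at index 1.
const combinationsSeeded = arr => {
  let parentValue = arr.length ? [[arr[0]]] : [];
  for (let i = 1; i < arr.length; i++) {
    const item = arr[i];
    const value = [[item]];
    for (const combo of parentValue) {
      value.push(combo.concat([item]));
    }
    parentValue = parentValue.concat(value);
  }
  return parentValue;
};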