Here is the input (a JavaScript object literal rather than strict JSON, so the trailing commas are legal):
const data = {
  "38931": [{
      "userT": "z",
      "personId": 13424,
      "user": {
        "id": 38931,
        "email": "sample",
      },
    },
    {
      "userType": "z",
      "personId": 19999,
      "user": {
        "id": 38931,
        "email": "sample",
      },
    }
  ],
  "77777": [{
    "userT": "z",
    "personId": 55555,
    "user": {
      "id": 77777,
      "email": "sample",
    },
  }]
}
The desired output should look like this:
{
  "38931": {
    "13424": {
      "userT": "z",
      "personId": 13424,
      "user": {
        "id": 38931,
        "email": "sample"
      }
    },
    "19999": {
      "userType": "z",
      "personId": 19999,
      "user": {
        "id": 38931,
        "email": "sample"
      }
    }
  },
  "77777": {
    "55555": {
      "userT": "z",
      "personId": 55555,
      "user": {
        "id": 77777,
        "email": "sample"
      }
    }
  }
}
While this produces the correct result, I am still looking for a more efficient approach that runs in O(n). The problem is the spread of `acc[id]` inside the reducer: it copies the sub-object accumulated so far on every iteration, which makes the last step O(n^2) in the worst case.
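To make the quadratic cost concrete, here is a small standalone counter (my own illustration, not part of the solution below): each spread copies every key accumulated so far, so k entries under one id cost roughly 0 + 1 + … + (k-1) copies.

```javascript
// Count how many keys the spread copies across 5 iterations.
let copies = 0;
let acc = {};
for (let i = 0; i < 5; i++) {
  copies += Object.keys(acc).length; // keys re-copied by the next spread
  acc = { ...acc, [i]: i };
}
console.log(copies); // 0 + 1 + 2 + 3 + 4 = 10
```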
const data = {
  "38931": [{
      "userT": "z",
      "personId": 13424,
      "user": {
        "id": 38931,
        "email": "sample",
      },
    },
    {
      "userType": "z",
      "personId": 19999,
      "user": {
        "id": 38931,
        "email": "sample",
      },
    }
  ],
  "77777": [{
    "userT": "z",
    "personId": 55555,
    "user": {
      "id": 77777,
      "email": "sample",
    },
  }]
}
const accumulatorList = (acc, id) => {
  acc.push(data[id]);
  return acc;
}

const accumulatorObject = (acc, [key, value]) => {
  const { user: { id } } = value;
  acc[id] = {
    ...acc[id],
    [key]: value
  };
  return acc;
}

// Collect each id's array from `data`, flatten to a 1D array, index it by
// "personId", then iterate those entries to build the final nested object.
const people = Object.keys(data)
  .reduce(accumulatorList, [])
  .flat()
  .reduce((acc, obj) => {
    acc[obj.personId] = obj;
    return acc;
  }, {});

const finalResult = Object
  .entries(people)
  .reduce(accumulatorObject, {});

console.log('finalResult', finalResult);
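One direction I am considering is a plain double loop that mutates a fresh sub-object per user id instead of re-spreading it, which should be a single O(n) pass. This is only a sketch; `groupByPersonId` and `sample` are my own names, and it assumes the same shape as `data` above:

```javascript
// Build { userId: { personId: entry } } in one pass, no spread copies.
const groupByPersonId = (data) => {
  const result = {};
  for (const [userId, entries] of Object.entries(data)) {
    const byPerson = {};
    for (const entry of entries) {
      byPerson[entry.personId] = entry; // O(1) assignment, no copying
    }
    result[userId] = byPerson;
  }
  return result;
};

// Minimal sample in the same shape as `data`:
const sample = {
  "38931": [{ "userT": "z", "personId": 13424, "user": { "id": 38931, "email": "sample" } }]
};
console.log(groupByPersonId(sample));
```

Each entry is visited exactly once, so the total work is proportional to the number of entries.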
If you have any suggestions on how to get this down to T = O(n), please let me know. Thank you!