I'm working on another Map/Reduce query in MongoDB.
Within my database, I have a collection named "example", which is structured like this:
{
    "userid" : "somehash",
    "channel" : "Channel 1"
}
The Map/Reduce functions I am using are as follows:
var map = function () {
    emit(this.channel, {user: this.userid, count: 1});
}

var reduce = function (key, values) {
    var result = {total: 0, unique: 0};
    var temp = [];
    values.forEach(function (value) {
        result.total += value.count;
        if (temp.indexOf(value.user) == -1) {
            temp.push(value.user);
        }
    });
    result.unique += temp.length;
    return result;
}
However, the results I am getting are quite unexpected:
{ "_id" : "Channel 1", "value" : { "total" : NaN, "unique" : 47 } }
{ "_id" : "Channel 2", "value" : { "total" : NaN, "unique" : 12 } }
{ "_id" : "Channel 3", "value" : { "total" : 6, "unique" : 6 } }
It seems that value.count is sometimes undefined (hence the NaN totals), and the unique value is not accurate either. My goal is to count all occurrences of each channel and the number of unique users per channel. The same userid/channel combination may appear in the collection multiple times, so I want to track both total occurrences and unique occurrences.
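To try to reproduce this outside the database, I simulated the reduce function in plain JavaScript. My guess (unconfirmed, based on my reading of the docs) is that the server may call reduce more than once per key and feed a previous reduce result back in as one of the values; since that result has no count or user fields, the addition produces NaN:

```javascript
// The same reduce function from the mapReduce call above.
var reduce = function (key, values) {
    var result = {total: 0, unique: 0};
    var temp = [];
    values.forEach(function (value) {
        result.total += value.count;
        if (temp.indexOf(value.user) == -1) {
            temp.push(value.user);
        }
    });
    result.unique += temp.length;
    return result;
};

// First pass over raw mapped values behaves as expected:
var first = reduce("Channel 1", [
    {user: "a", count: 1},
    {user: "b", count: 1}
]);
// first.total is 2, first.unique is 2

// Simulated re-reduce: the prior output is passed back as a value.
// It has neither a count nor a user field.
var second = reduce("Channel 1", [first, {user: "a", count: 1}]);
// second.total is NaN (0 + undefined + 1), and second.unique
// counts the undefined user as if it were a distinct user.
```

If that simulation matches what the server actually does, it would explain both the NaN totals and the inflated unique counts, but I'm not sure whether that's the real mechanism here.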
I consulted the MongoDB documentation, but I am still puzzled by these results. Any insights or suggestions on how to resolve this issue?
Thank you for your guidance and expertise.