I recently wrote some code and encountered an unexpected result:
var a = new Array(10); // expecting ten undefined entries, i.e. [undefined, undefined, ...]
var b = _.every(a, function(m){
    if(_.isUndefined(m)){
        return false;
    }
    return true;
});
I expected b to be false, but to my surprise it was true. Why did it return true?
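For what it's worth, a plain for loop over the same array does see undefined at every index (this sanity check is my own addition, not part of the fiddle):

console.log(a.length); // 10
for (var i = 0; i < a.length; i++) {
    console.log(_.isUndefined(a[i])); // logs true ten times
}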
Curious about this outcome, I made a modification:
var c = [undefined, undefined];
var d = _.every(c, function(m){
    if(_.isUndefined(m)){
        return false;
    }
    return true;
});
This time d was false, in contrast to the previous case. What causes these different results?
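One difference I did notice between the two arrays, in case it's relevant (again, my own check rather than part of the original code): the in operator says a has no element at index 0, while c does:

console.log(0 in a); // false - new Array(10) sets length but stores no elements
console.log(0 in c); // true  - [undefined, undefined] actually stores undefined at indices 0 and 1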
If you're interested, you can test this yourself at http://jsfiddle.net/3qj4B/3/.