Recently I was tackling a JavaScript challenge where the task involved dividing the number of volunteers by the number of neighborhoods.
To get each count, I decided to use .length, the array property that returns the number of elements in an array. However, what puzzled me was that one of my approaches passed the test while the other failed. Here are the two sets of code:
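(For context, here is my own quick console check of how .length behaves; the sample array is just something I made up, not part of the challenge.)

// My own quick check, not part of the challenge code:
const sample = ['a', 'b', 'c'];   // hypothetical example array
console.log(sample.length);       // 3, .length is a property, read without ()
console.log((3).length);          // undefined, a plain number has no length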
The code that passed the test:
const divide = function (volunteers, neighborhood) {
  let neighborhoodLength = neighborhood;
  let volunteersLength = volunteers.length;
  let evaluate = neighborhoodLength / volunteersLength;
  return evaluate;
};
The code that failed the test:
const divide = function (volunteers, neighborhood) {
  return volunteers.length / neighborhood.length;
};
When provided with arrays like the following:
const volunteers = [
  'Sally',
  'Jake',
  'Brian',
  'Hamid'
];
const neighbourhoods = [
  'Central Valley',
  'Big Mountain',
  'Little Bridge',
  'Bricktown',
  'Brownsville',
  "Paul's Boutique",
  'Clay Park',
  'Fox Nest'
];
The expected output should be 2.
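To double-check the arithmetic for myself (my own console check; I am assuming the function is called with the two arrays above, since I cannot see the actual test), this is how the lengths work out:

// My own sanity check, assuming the two arrays above are what gets passed in:
console.log(volunteers.length);                          // 4
console.log(neighbourhoods.length);                      // 8
console.log(neighbourhoods.length / volunteers.length);  // 2, the expected result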
My main concern is understanding why the two approaches produced different results: why did one pass the test while the other failed? Any insight into this discrepancy would be greatly appreciated!