I am trying to iterate over the values of a JSON object, sum them, and return their average, but I am running into a problem.
Here is the JSON object in question:
var testJson = [{
  "1": "0.038728952407837",
  "2": "0.034420967102051",
  "3": "0.034113883972168",
  "4": "0.033237934112549",
  "5": "0.033545017242432",
  "6": "0.033923149108887",
  "7": "0.033990859985352",
  "8": "0.033454895019531",
  "9": "0.033518075942993",
  "10": "0.033759117126465",
  "11": "0.033965826034546",
  "12": "0.03358006477356",
  "13": "0.033926010131836",
  "14": "0.033300876617432",
  "15": "0.033140897750854",
  "16": "0.033447027206421",
  "17": "0.033830165863037",
  "18": "0.033417940139771",
  "19": "0.033578157424927",
  "20": "0.032893180847168"
}];
Below is the code snippet I have been using:
var arr = testJson[0];
var total = 0;
for (var i = 0; i < arr.length; i++) {
  total += arr[i];
}
console.log(total);
Instead of a numeric sum, the output is one long concatenated string:
0.0387289524078370.0344209671020510.0341138839721680.0332379341125490.0335450172424320.0339231491088870.0339908599853520.0334548950195310.0335180759429930.0337591171264650.0339658260345460.033580064773560.0339260101318360.0333008766174320.0331408977508540.0334470272064210.0338301658630370.0334179401397710.0335781574249270.0328931808471680.0339531898498540.0339729785919190.0338070392608640.0332689285278320.0333919525146480.033372879028320.0353031158447270.0355949401855470.0359919071197510.036854982376099
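I suspect this happens because the values are strings, so += performs string concatenation rather than numeric addition (e.g. "0.03" + "0.04" evaluates to "0.030.04"), but I am not certain that is the whole story.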
I need help identifying where I am going wrong in my approach.
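For reference, here is a sketch of what I think the fix might look like, assuming the string values just need to be converted with parseFloat before summing (the variable names values, sum, and average below are my own):

var values = Object.keys(arr).map(function (key) {
  return parseFloat(arr[key]); // convert each string value to a number
});

var sum = values.reduce(function (acc, n) {
  return acc + n; // numeric addition now that the values are numbers
}, 0);

var average = sum / values.length;
console.log(average); // I expect roughly 0.0339 for the 20 values above

Is converting with parseFloat like this the right way to handle it, or is there a more idiomatic approach?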