I am currently working on a GPS data visualizer built with Backbone. Each GPS datum is stored in a Backbone model, and all of the data is held in a collection. I would like to understand how much memory and CPU overhead this adds compared to simply keeping the data as plain JSON objects in an array.
My project retrieves GPS data for anywhere between 1 and 20+ tracked objects, over durations ranging from 5 minutes to 10 hours. A data point is recorded every second, which works out to an average of 25,000 points per session.
As it stands, the project consumes up to 1 GB of RAM and becomes sluggish. Unfortunately, the data cannot be reduced any further.
Thank you in advance for any insights you may have.