I'm trying to find the most effective approach for handling large datasets in JavaScript. Specifically, I'm working with files containing four hours of data sampled at 100 ms intervals, which works out to roughly 144,000 data points per file.
When I load all of the data into a single one-dimensional array, the page struggles to load it at once. I'm considering splitting it into smaller arrays, and I'm wondering whether there is an optimal size for those arrays.
This is my first time encountering a challenge like this, so I'd greatly appreciate any advice. Some of the arrays are only about 500 KB, yet they still fail to load properly as single module exports. Does fetching the data with AJAX offer better performance for some reason? If so, what might the underlying reasons be?