I am currently developing a standalone JavaScript application, built with Spine and Node.js, that serves as an interactive 'number property' explorer. Users can pick any number and discover its various properties, such as whether it is prime, triangular, and so on. You can check out an earlier version of this project here.
I have this working for the numbers 1 through 10,000, and my goal is to extend it to cover numbers up to 1 million, or even 1 billion.
To achieve this without relying on a server backend, I plan to have the client download a set of static data files and use them to present information to the user. Currently these data files are JSON. For simpler properties, I have client-side algorithms that derive the answer on the fly; for more expensive computations, I pre-compute the values and store them in JSON.
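To make that split concrete, here is a rough sketch of what I mean (the file name, JSON structure, and property choices are made up for illustration, not my actual code):

```
// Cheap properties are derived on the fly in the browser.
function isTriangular(n) {
  // n is triangular iff 8n + 1 is a perfect square
  var root = Math.round(Math.sqrt(8 * n + 1));
  return root * root === 8 * n + 1;
}

// Expensive properties are pre-computed and shipped as static JSON files.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'data/happy-numbers.json', true); // hypothetical file name
xhr.onload = function () {
  var happy = JSON.parse(xhr.responseText);       // e.g. { "1": true, "7": true, ... }
  console.log(42, 'triangular?', isTriangular(42), 'happy?', !!happy[42]);
};
xhr.send();
```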
To keep the datasets manageable, I experimented with a pure JavaScript Bloom filter and with CONCISE bitmaps, but I eventually realized that no matter how compactly the data is encoded, it still has to be represented as JSON text to reach the client.
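This is the kind of dead end I keep hitting; the field names and data below are invented, but the shape is representative:

```
// However compact the in-memory structure is (Bloom filter, CONCISE bitmap,
// plain bit array), the wire format is still JSON text, e.g. a base64 string:
var payload = {
  property: 'prime',
  range: [1, 1000000],
  encoding: 'base64',            // base64 alone inflates the raw bits by ~33%
  bitmap: 'qAAVb9u0...'          // placeholder, not real data
};
// ...which then has to be stringified, downloaded, parsed, and decoded back to bits.
var wireFormat = JSON.stringify(payload);
```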
The problem is scale: with 30 properties per number, a million numbers means roughly 30 million data points. I want to get this data to my JavaScript app efficiently without forcing users into massive downloads.
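My back-of-envelope numbers (rough assumptions, not measurements): the 30 properties are effectively yes/no flags, so they fit into a single 32-bit mask per number, which would be about 4 MB of raw binary for the first million numbers before any compression:

```
// Packing 30 boolean properties into one 32-bit mask per number.
// 1,000,000 numbers * 4 bytes = ~4 MB raw, before gzip.
function packProperties(flags) {          // flags: array of up to 30 booleans
  var mask = 0;
  for (var i = 0; i < flags.length; i++) {
    if (flags[i]) mask |= (1 << i);
  }
  return mask;
}

function hasProperty(mask, propertyIndex) {
  return (mask & (1 << propertyIndex)) !== 0;
}
```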
What are some practical ways to deliver datasets this large to a purely client-side JavaScript application? Is converting the data to binary and reading the binary back on the client a viable option? Any examples would be greatly appreciated!
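For concreteness, this is roughly what I imagine the binary route would look like on the client, assuming the target browsers support XHR2 and typed arrays (the file name and bit layout are invented):

```
var xhr = new XMLHttpRequest();
xhr.open('GET', 'data/properties-0-999999.bin', true); // hypothetical pre-computed file
xhr.responseType = 'arraybuffer';                       // XHR2: get raw bytes, not text
xhr.onload = function () {
  var masks = new Uint32Array(xhr.response);            // one 32-bit mask per number
  var n = 561;
  var isPrime      = (masks[n] & (1 << 0)) !== 0;       // bit positions are illustrative
  var isTriangular = (masks[n] & (1 << 1)) !== 0;
  console.log(n, 'prime?', isPrime, 'triangular?', isTriangular);
};
xhr.send();
```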