I have a server request that can return a large JSON array of points (around 100K records, roughly 50 MB) to be drawn on a canvas with D3.js. To keep the page interactive and memory usage low, I'd like to render the points as they stream in. Here's what I've done so far:
First, I enabled chunked transfer encoding on the server side. Then, on the client side, this is what I tried:
d3.json('?json=qDefects&operationid=' + opid) // my request
    .on("load", function (json) {
        draw(json); // works, but only after the whole response has arrived and been parsed, which is the delay I want to eliminate
    })
    .on("progress", function (json) {
        draw(json); // fails: the parsed JSON isn't available here (D3 invokes this with the raw XMLHttpRequest, not the data)
    })
    .get();
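Digging into the D3 v3 source, my understanding is that the "progress" listener receives the underlying XMLHttpRequest (with the ProgressEvent exposed as d3.event), so the partial raw text is reachable via responseText. Below is an untested sketch of what I've been toying with: it cuts the buffered text at the last "}," record boundary so only complete objects get parsed. It's fragile (it relies on "}," never appearing inside a record, which holds for these simple points), and I switched to d3.xhr so D3 doesn't re-parse the entire 50 MB on load:

var buffer = ""; // received text not yet drawn
var seen = 0;    // length of responseText already copied into buffer

d3.xhr('?json=qDefects&operationid=' + opid)
    .on("progress", function (request) {
        // D3 v3 passes the raw XMLHttpRequest here; the ProgressEvent is d3.event
        buffer += request.responseText.substring(seen);
        seen = request.responseText.length;
        // buffer alone is not valid JSON (the array is still open), so cut it
        // at the last record boundary; "}," never occurs inside these points
        var cut = buffer.lastIndexOf("},");
        if (cut >= 0) {
            var complete = buffer.substring(0, cut + 1).replace(/^\[/, "");
            buffer = buffer.substring(cut + 2);     // keep the unfinished tail
            draw(JSON.parse("[" + complete + "]")); // re-wrap as a small array
        }
    })
    .get(function (error, request) {
        if (error) return console.error(error);
        // a progress event may not cover the tail, so sync the buffer first,
        // then flush what's left: the last record plus the closing "]"
        buffer += request.responseText.substring(seen);
        draw(JSON.parse("[" + buffer.replace(/^\[/, "").replace(/\s*\]\s*$/, "") + "]"));
    });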
Is there a cleaner way to process the JSON in chunks as it loads? Would restructuring the JSON help? Right now it's one single array, like this:
[{"x":1, "y":2},{"x":2, "y":3}, // chunk 1
...
{"x":6845, "y":239426},{"x":51235, "y":234762}] // last chunk
Would splitting the points into smaller arrays be a better approach?
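If restructuring is an option, one idea I've been considering (again untested, and again assuming D3 v3's xhr behavior) is dropping the enclosing array entirely and sending newline-delimited JSON, one point per line:

{"x":1, "y":2}
{"x":2, "y":3}
...
{"x":51235, "y":234762}

Every complete line is then independently parseable, so the progress handler only has to split on "\n" instead of doing string surgery on a half-open array:

var seen = 0; // index into responseText up to which complete lines have been drawn

function drawNewLines(text, upTo) {
    var points = text.substring(seen, upTo)
        .split("\n")
        .filter(function (line) { return line.trim(); })
        .map(function (line) { return JSON.parse(line); }); // each line is one point
    seen = upTo;
    if (points.length) draw(points);
}

d3.xhr('?json=qDefects&operationid=' + opid)
    .on("progress", function (request) {
        var text = request.responseText;
        var cut = text.lastIndexOf("\n") + 1; // end of the last complete line
        if (cut > seen) drawNewLines(text, cut);
    })
    .get(function (error, request) {
        if (error) return console.error(error);
        // flush a final line that wasn't terminated by "\n"
        drawNewLines(request.responseText, request.responseText.length);
    });

The appeal of this layout is that a record boundary is always exactly "\n", and the server can simply flush every N lines. Is this a reasonable direction, or is there a more standard pattern for this?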