How to handle a large dataset in D3.js
I have an 11 MB dataset, and it's slow to fetch and parse it every time the
page loads:
d3.csv("https://s3.amazonaws.com/vidaio/QHP_Individual_Medical_Landscape.csv",
function(data) {
// drawing code...
});
I know that crossfilter can be used to slice and dice the data once it's
loaded in the browser, but before that can happen the full dataset still has
to be downloaded, and I only ever use an aggregation of it. It seems like I
should pre-process the data on the server before sending it to the client,
maybe by running crossfilter (or something similar) on the server side. Any
suggestions on how to handle/process a large dataset for D3? To make the
question concrete, rough sketches of the two options I have in mind follow.
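Client-side, this is roughly the kind of crossfilter aggregation I mean; the
metal_level and premium column names are placeholders I made up, not actual
fields from the CSV:

    // Build a crossfilter over the parsed rows and group by one column.
    var cf = crossfilter(data);
    var byMetal = cf.dimension(function(d) { return d.metal_level; });
    var premiumByMetal = byMetal.group().reduce(
        function(p, v) { p.count += 1; p.total += +v.premium; return p; }, // row enters filter
        function(p, v) { p.count -= 1; p.total -= +v.premium; return p; }, // row leaves filter
        function()     { return { count: 0, total: 0 }; }                  // initial value per group
    );
    // premiumByMetal.all() yields one small record per metal level, which is
    // all the drawing code needs -- but the full 11 MB still had to be
    // downloaded and parsed first.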
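Server-side, I'm imagining a one-off Node.js pre-aggregation step along these
lines (assuming the d3-dsv package is available; the column names are again
placeholders):

    // aggregate.js -- run once on the server, not in the browser
    var fs = require("fs");
    var dsv = require("d3-dsv");

    var rows = dsv.csvParse(
        fs.readFileSync("QHP_Individual_Medical_Landscape.csv", "utf8"));

    // Roll the rows up into one count/total pair per state.
    var totals = {};
    rows.forEach(function(row) {
        var key = row.state;                              // placeholder column
        totals[key] = totals[key] || { count: 0, premium: 0 };
        totals[key].count += 1;
        totals[key].premium += +row.premium;              // placeholder column
    });

    fs.writeFileSync("summary.json", JSON.stringify(totals));

    // The page would then load summary.json (a few KB) with d3.json instead
    // of pulling down the raw 11 MB CSV.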