I’d like to use javascript to create and save a large .csv file client-side. This has been asked and answered before (see 1, 2, 3, or 4, for example), and this comes down to two approaches (among others):
1. Use FileSaver.js to implement w3c's saveAs():

       var lines = ["line 1,cell 11,cell 12\n", "line 2,cell 21,cell 22\n"];
       var type = "text/csv;charset=utf-8";
       var blob = new Blob(lines, { "type": type });
       saveAs(blob, "download.csv");
2. Use a[download] and a data URI:

       var lines = ["line 1,cell 11,cell 12\n", "line 2,cell 21,cell 22\n"];
       var type = "text/csv;charset=utf-8";
       var downloader = $('<a download href="data:' + type + ',' + escape(lines.join('')) + '"></a>')
           .appendTo("body");
       downloader[0].click();
       downloader.remove();
My problem is that my file can be gigantic: 18 million lines and 2.5 GB (and I'd like to be able to handle more). Building the whole lines array up front uses too much memory and crashes the page. But there's no reason to hold the entire file in the browser's memory just to save it to the hard drive. Is there a way to create a progressive download with JavaScript (in other words, start the download but keep appending lines as I calculate them)? Or would my only option be to download the file as separate chunks that the user must then join together?
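To make the goal concrete, here is a rough sketch of the write pattern I'm hoping for, using the File System Access API (showSaveFilePicker / createWritable). I realize this API is only available in some browsers and may not be the right tool, and computeLine() below is just a placeholder for my own row-generating code:

    async function saveCsvProgressively(totalLines) {
      // Ask the user for a destination file before generating any data.
      const handle = await window.showSaveFilePicker({
        suggestedName: "download.csv",
        types: [{ description: "CSV file", accept: { "text/csv": [".csv"] } }],
      });
      const writable = await handle.createWritable();

      // Write each row as it is calculated instead of buffering ~2.5 GB in memory.
      for (let i = 0; i < totalLines; i++) {
        const line = computeLine(i); // placeholder for my row-generating code
        await writable.write(line + "\n");
      }

      // The file only appears complete once the stream is closed.
      await writable.close();
    }

Something with this shape (or an equivalent using a library) is what I mean by "progressive download": the rows go to disk as they are produced, and memory usage stays roughly constant.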