
We have a REST API that emits JSON, from which we can source data via AJAX. What we want is to make a series of AJAX calls, where each call fetches a small chunk of data (say 5000 rows or so), and then have it written out to a CSV file on the browser as a download.

It seems that if we have all the data in JS memory, then writing it out to a CSV file is not hard; but if we want to write out 100K records, we are forced to fetch all 100K in one shot and then produce the file in one go.
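(To illustrate the "not hard" part: once a chunk of JSON rows is in memory, converting it to CSV text is a few lines of plain JavaScript. This is just a sketch; `rowsToCsv` and the field names are made up for the example.)

```javascript
// Convert an array of flat JSON objects to CSV text.
// Quotes any field containing commas, quotes, or newlines (RFC 4180 style).
function rowsToCsv(rows, columns) {
  var escape = function (value) {
    var s = value == null ? "" : String(value);
    return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
  };
  var lines = [columns.map(escape).join(",")]; // header row
  rows.forEach(function (row) {
    lines.push(columns.map(function (col) { return escape(row[col]); }).join(","));
  });
  return lines.join("\n");
}

// Hypothetical example rows:
var csv = rowsToCsv(
  [{ id: 1, name: "Ann" }, { id: 2, name: 'Bob "B"' }],
  ["id", "name"]
);
```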

Instead, we feel it would be much gentler on both server and client to download small chunks and stream them out to the file download. Is there a way to do this?

(We are presently using jQuery 2.0.0, but don't mind using a different library to get this done)

Shreeni
  • why not use a hidden iframe for downloading the stream as a file? http://stackoverflow.com/questions/16799483/using-jquery-and-iframe-to-download-a-file – Henrik Jun 29 '15 at 06:52
  • @Henrik That works, AFTER I have all the contents in JS memory. I want to do cycles of fetch->write-out-to-file->fetch. – Shreeni Jun 29 '15 at 07:16
  • ? It never enters JS memory. JS simply initiates the download by pointing the iframe to the source. The browser takes care of the download, and the server takes care of chunking the file into network packages - or have I misunderstood the question completely? http://stackoverflow.com/questions/3749231/download-file-using-javascript-jquery – Henrik Jun 29 '15 at 07:49
  • The API serves only JSON and we need to process it in JS before sending it out to CSV file. Additionally a single round of the API pull will only return a limited set of records (5K), but the overall data might be say 100K. – Shreeni Jun 30 '15 at 00:03
  • Additionally, we don't have the ability to modify the API server (to chunk packages and/or change the output format). – Shreeni Jun 30 '15 at 00:03
  • ok, I somehow missed that (only json) in the question. – Henrik Jun 30 '15 at 06:18
  • @Henrik Just realised that JSON-bit wasn't in the question, edited now.. – Shreeni Jun 30 '15 at 09:54
  • ok, then I would claim it's impossible at the moment. The only thing I know of that comes close is this: http://eligrey.com/demos/FileSaver.js/; for more explanations and answers, see these semi-related questions: http://stackoverflow.com/questions/23702157/download-file-client-side-chunk-by-chunk, http://stackoverflow.com/questions/19287586/save-client-generated-data-as-file-in-javascript-in-chunks. – Henrik Jun 30 '15 at 10:48
  • @Henrik Thanks. I was arriving at the same conclusion myself. For the time being, we plan to create a server side proxy script that will talk to the JSON API and then stream it out into a CSV. On the client side, we are planning to open a connection to that server side script as a file download. – Shreeni Jul 02 '15 at 08:01

1 Answer


Basically you are looking for paging of records. For this you can:

  1. Find the total number of records in the database
  2. Then divide: records / number of calls
  3. Merge all the data coming from the different calls

for making multiple call using jquery you can do like this

$.when( $.ajax( "/page1.php" ), $.ajax( "/page2.php" ) ).done(function( a1, a2 ) {
  // a1 and a2 are arguments resolved for the page1 and page2 ajax requests, respectively.
  // Each argument is an array with the following structure: [ data, statusText, jqXHR ]
  var data = a1[ 0 ] + a2[ 0 ]; // a1[ 0 ] = "Whip", a2[ 0 ] = " It"
  if ( /Whip It/.test( data ) ) {
    alert( "We got what we came for!" );
  }
});

In the code there are multiple AJAX calls, and at the end it merges the data. You have to do the same thing on the server side; if you are using C#, the TPL (Task Parallel Library) is a good option. For each call you need to pass the page number.
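If you want the calls to run one after another rather than in parallel (so each chunk is processed and released before the next request fires, as the question asks), you can chain the promises instead of using `$.when`. A sketch with a generic pager - `fetchInChunks` and `fetchPage` are hypothetical names, and in real code `fetchPage` would wrap `$.ajax` with the page number as a parameter:

```javascript
// Sequentially fetch pages 1..totalPages, passing each chunk to onChunk
// before the next request is issued. fetchPage(n) must return a promise
// resolving to an array of records (e.g. a wrapped $.ajax call).
function fetchInChunks(fetchPage, totalPages, onChunk) {
  var page = 1;
  function next() {
    if (page > totalPages) return Promise.resolve();
    return fetchPage(page).then(function (records) {
      onChunk(records, page); // e.g. append the chunk to the CSV output
      page += 1;
      return next();          // only now request the following page
    });
  }
  return next();
}

// Stubbed usage: three "pages" of two records each.
var seen = [];
var done = fetchInChunks(
  function (n) { return Promise.resolve([n * 2 - 1, n * 2]); },
  3,
  function (records) { seen = seen.concat(records); }
).then(function () {
  // seen is now [1, 2, 3, 4, 5, 6]
});
```

Note that this only sequences the fetching; it does not by itself solve the original problem of streaming the accumulated CSV to disk without holding it all in memory.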

Pranay Rana
  • This would still need me to effectively store all contents in the JS memory, which is what I want to avoid in the first place. – Shreeni Jun 29 '15 at 07:14