
I am working on the following problem: I want to download an .xlsx file (on click) using SheetJS to create the workbook and FileSaver to save the file. The problem I encounter is that when the data exceeds about 100k rows, the file cannot be written. Code snippet of how I build the whole process (mainly taken from the SheetJS examples and the following tutorial: https://www.youtube.com/watch?v=41rOAt-zCu4):

    let wb = XLSX.utils.book_new()
    wb.Props = {
      Title: 'Test Sheet',
      Subject: 'Test file',
      Author: 'Firstname Lastname'
    }
    // create a worksheet in the new workbook from an array of arrays;
    // rowData is the array of arrays with more than 90k rows
    wb.SheetNames.push('Sheet1')
    let ws = XLSX.utils.aoa_to_sheet(rowData)
    wb.Sheets['Sheet1'] = ws
    // serialize the workbook; at this point it exists in memory as one binary string
    let wbout = XLSX.write(wb, {bookType: 'xlsx', type: 'binary'})
    // FileSaver needs a Blob, so the binary string is converted
    // to an ArrayBuffer with the s2ab helper
    function s2ab (s) {
      let buf = new ArrayBuffer(s.length)
      let view = new Uint8Array(buf)
      for (let i = 0; i < s.length; i++) view[i] = s.charCodeAt(i) & 0xFF
      return buf
    }
    FileSaver.saveAs(new Blob([s2ab(wbout)], {type: 'application/octet-stream'}), 'test.xlsx')
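
(Side note: a variant worth knowing, assuming a SheetJS version that supports the 'array' output type — XLSX.write can then return an ArrayBuffer directly, which makes the s2ab conversion unnecessary:)

    // type: 'array' makes XLSX.write return an ArrayBuffer directly,
    // so no manual string-to-buffer conversion is needed
    let wbArray = XLSX.write(wb, {bookType: 'xlsx', type: 'array'})
    FileSaver.saveAs(new Blob([wbArray], {type: 'application/octet-stream'}), 'test.xlsx')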

I had the idea of writing several Excel sheets with 90000 rows each (sketched below), but I assume that does not help, because the object I want to write is simply too big for local memory. I also suspect the error is produced by FileSaver when the octet stream is created, but I am not sure.
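
For reference, a minimal sketch of that multi-sheet idea, using SheetJS's book_append_sheet helper (the 90000-row chunk size is arbitrary). It does not actually reduce peak memory, because XLSX.write still serializes the whole workbook into a single string:

    // split rowData into chunks of 90000 rows, one worksheet per chunk
    const CHUNK = 90000
    let wb = XLSX.utils.book_new()
    for (let i = 0; i < rowData.length; i += CHUNK) {
      let ws = XLSX.utils.aoa_to_sheet(rowData.slice(i, i + CHUNK))
      XLSX.utils.book_append_sheet(wb, ws, 'Sheet' + (i / CHUNK + 1))
    }
    // writing still builds the entire workbook in memory at once
    let wbout = XLSX.write(wb, {bookType: 'xlsx', type: 'binary'})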

So I was searching for a solution and thought of streams: instead of generating the whole object and then trying to write it, I would like to write row by row while the rows are generated. That should not use up my memory, right?

Yet I could not find anything on how to do this (I have never worked with streams before). The main constraint is memory, since the project has to run offline on the local system with its limited RAM.

Shushiro
  • I don't know the size of your file and your memory size, but the first paragraph in the GitHub readme of `FileSaver.js` is: `If you need to save really large files bigger than the blob's size limitation or don't have enough RAM, then have a look at the more advanced StreamSaver.js that can save data directly to the hard drive asynchronously with the power of the new streams API. That will have support for progress, cancelation and knowing when it's done writing` – John Apr 19 '19 at 08:52
  • I have dug deeper now and you are correct, the Blob's size limitation is the problem. So far I am trying other solutions, such as appending, but I will dig into StreamSaver.js as well, thank you! – Shushiro Apr 23 '19 at 08:16
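
Building on John's StreamSaver.js suggestion, here is a minimal sketch of writing rows to disk one by one, assuming rowData is an array of arrays and the streamSaver global from StreamSaver.js is available. Note that this produces a CSV, not an .xlsx (the xlsx container cannot easily be streamed row by row in the browser), but each row is encoded and flushed as it is generated instead of being accumulated in memory:

    // stream rows to a file on disk as they are produced
    const fileStream = streamSaver.createWriteStream('test.csv')
    const writer = fileStream.getWriter()
    const encoder = new TextEncoder()
    async function writeRows () {
      for (const row of rowData) {
        // only one encoded row is held in memory at a time
        await writer.write(encoder.encode(row.join(',') + '\n'))
      }
      await writer.close()
    }
    writeRows()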

1 Answer


A solution for downloading my large amount of data (not fully finished yet) was found here:

Javascript: Exporting large text/csv file crashes Google Chrome

I am currently working on the row information (see How to export JavaScript array info to csv (on client side)?), but the post above should help anybody who stumbles across such a problem.
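
For anyone who cannot use streams, a minimal sketch of the incremental-Blob pattern from the linked thread (chunk size and file name are arbitrary): Blobs can be built from other Blobs, so the CSV grows as a Blob in pieces and no single huge string ever exists:

    // grow the CSV as a Blob in chunks instead of one giant string
    let blob = new Blob([], {type: 'text/csv'})
    let buffer = []
    rowData.forEach(function (row, i) {
      buffer.push(row.join(',') + '\n')
      // every 10000 rows (and at the end), fold the buffered strings into the Blob
      if (buffer.length === 10000 || i === rowData.length - 1) {
        blob = new Blob([blob, buffer.join('')], {type: 'text/csv'})
        buffer = []
      }
    })
    FileSaver.saveAs(blob, 'test.csv')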

Shushiro