I'm trying to read a very large gzipped CSV file in Node.js. So far, I've been using zlib for this:
file.createReadStream().pipe(zlib.createGunzip())
is the stream I pass to Papa.parse. This works fine for most files, but it fails with a very large gzipped CSV file (250 MB, unzips to 1.2 GB), throwing this error:
Error: incorrect header check
at Zlib.zlibOnError [as onerror] (zlib.js:180:17) {
errno: -3,
code: 'Z_DATA_ERROR'
}
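For context, the full setup is roughly the following (a minimal sketch; file is an object exposing createReadStream(), e.g. a cloud-storage file handle, and the Papa.parse options shown are only illustrative):

const zlib = require('zlib');
const Papa = require('papaparse');

// decompress on the fly and hand the resulting stream to PapaParse
const csvStream = file.createReadStream().pipe(zlib.createGunzip());

Papa.parse(csvStream, {
  header: true,
  step: (row) => {
    // handle each parsed row here
  },
  complete: () => console.log('done parsing'),
  error: (err) => console.error('Papa.parse error:', err),
});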
Originally I thought the size of the file was causing the error, but now I'm not so sure; maybe the file was compressed with a different algorithm. A similar question, zlib.error: Error -3 while decompressing: incorrect header check, suggests passing either -zlib.Z_MAX_WINDOWBITS or zlib.Z_MAX_WINDOWBITS|16 as the windowBits value to correct for that, but I tried it and that's not the problem.
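My understanding of the Node equivalents of that windowBits suggestion is roughly this (a sketch; mapping the Python-style windowBits values onto these classes is my assumption):

const zlib = require('zlib');

// Node picks the header handling by which decompressor class you create,
// rather than by the +16 / negative windowBits trick:
const gunzip = zlib.createGunzip();         // expects a gzip header (what I had)
const unzip = zlib.createUnzip();           // auto-detects gzip or zlib headers
const inflateRaw = zlib.createInflateRaw(); // raw deflate, no header at all

// e.g. file.createReadStream().pipe(unzip), then pass that stream to Papa.parse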