
I have stored a JSON file on blob storage with .gz compression, so my file URL looks like this: https://abcd.blob.core.windows.net/mydir/largefilecompressed.json.gz. Now I need to understand how I can fetch this and decompress it to largefilecompressed.json in ReactJS.

The file is 2-3 MB uncompressed. In low-bandwidth areas it takes time to load and hurts the user experience, which is why I want to avoid doing this on the server.

Justin Mathew
    I would suggest doing this on the server side via an AJAX/fetch call. – Stephen Collins Jul 05 '21 at 11:14
  • As @StephenCollins already mentioned, you should do that server-side via gunzip, maybe via shell.exec('someScript.sh') or similar (depending on the OS & backend), and then make the AJAX call from the frontend to fetch the data. – iLuvLogix Jul 05 '21 at 11:28
  • The JSON file is 2-4 MB; in low-bandwidth areas it takes a lot of bandwidth to transfer if we decompress it at the server. That's the reason why I am looking at this option. – Justin Mathew Jul 05 '21 at 11:38
  • Can't you request the data in parts? I doubt you need all 3 MB from the start. – Reyno Jul 05 '21 at 11:45
  • The file consists of a set of lat/long points and some data related to each one. We are using https://developers.google.com/maps/documentation/javascript/examples/polyline-simple to plot them all onto a Google map. Even if we manage to do it part by part, the missing data will cause trouble in the UI; also, managing this at the back end is highly complex, as most of the data is calculated on the fly. – Justin Mathew Jul 05 '21 at 12:29
  • @Reyno I am a beginner in ReactJS. I just need to understand: is it not possible to do it from ReactJS? – Justin Mathew Jul 05 '21 at 12:30
  • Does this not address your need? https://stackoverflow.com/questions/14620769/decompress-gzip-and-zlib-string-in-javascript Alternatively, why not put a simple web server in front of that file, so that Chrome could handle decompression automatically? (A sketch of that approach follows these comments.) – XML Jul 08 '21 at 13:33
  • First of all, decompression on the front end is not possible. If you have a large file, you can create a maps image on the backend and share the plotted image with the front end using an image map: https://developer.mozilla.org/en-US/docs/Web/HTML/Element/map – Amir Saleem Jul 08 '21 at 13:35
  • When you do this on the server, it's also a good idea not to use any gzip libraries and just use the `gzip` command (on Linux), like `gzip -d largefilecompressed.json.gz`. – Zed Jul 10 '21 at 00:02
  • How about using a package like [Pako](https://github.com/nodeca/pako)? – PsyGik Jul 13 '21 at 17:30
  • Hi all, as a few of you suggested, Pako was the right way to do it. I was confused a bit when some members in the community said straight away that it's not possible! Thanks for the support, mates. – Justin Mathew Jul 15 '21 at 06:43
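
A sketch of the "let the browser decompress it" idea from the comment above (this is not from the thread; the container and file names are taken from the question's URL): Azure Blob Storage lets you set a Content-Encoding property on a blob, and browsers transparently decompress responses served with Content-Encoding: gzip, so no client-side library is needed at all. It assumes the file was gzipped before upload and that CORS is enabled on the storage account.

const { BlobServiceClient } = require("@azure/storage-blob");

// one-time setup (Node): mark the already-gzipped blob so browsers decompress it themselves
async function markBlobAsGzip() {
  const service = BlobServiceClient.fromConnectionString(
    process.env.AZURE_STORAGE_CONNECTION_STRING
  );
  const blob = service
    .getContainerClient("mydir") // container name from the question's URL
    .getBlockBlobClient("largefilecompressed.json.gz");
  await blob.setHTTPHeaders({
    blobContentType: "application/json",
    blobContentEncoding: "gzip",
  });
}

// after that, the React app needs only a plain fetch; the body arrives decompressed:
// const res = await fetch("https://abcd.blob.core.windows.net/mydir/largefilecompressed.json.gz");
// const data = await res.json();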

2 Answers


There's no reason why you can't decompress a blob in React or JavaScript. You can decompress it in plain JavaScript, and with WebAssembly you can even do it at nearly native performance.

You can use the pako library to do this job.

From its documentation:

inflate

Decompress data with inflate/ungzip and options. Autodetect format via wrapper header by default.

Here is the sample code, with some .json.gz files picked randomly from GitHub.

const urls = [
  'https://raw.githubusercontent.com/NazarHarashchak/MobileBanking/9155a04a3ff064167537a7c32f9cca356a5c3ab4/FrontEnd/node_modules/.cache/eslint-loader/b3fa51dc9159babf532b97696dacb328bf0a70dc.json.gz',
  'https://raw.githubusercontent.com/mongodb-university/mflix-python/d9667e709bd400f3d3dbd6e7f1474b3702d9d5fa/data/mflix/comments.metadata.json.gz',
  'https://raw.githubusercontent.com/dump-sauraj/realme_rmx2185_dump964/3a9c42cac2977a13e43ca8bf1ff886fca730f158/system/system/etc/protolog.conf.json.gz'
]

async function exec(i = 0) {
  console.group('file: ', i);
  try {
    // fetch file with CORS enabled
    const res = await fetch(urls[i], {
      mode: 'cors'
    });
    // convert to arrayBuffer for further processing
    const buf = await res.arrayBuffer();
    // or get blob using `await res.blob()`
    // and convert blob to arrayBuffer using `await blob.arrayBuffer()`

    console.log('input size: ', buf.byteLength);

    // decompress file
    const outBuf = pako.inflate(buf);
    console.log('output size: ', outBuf.byteLength);

    // convert arrayBuffer to string
    const str = new TextDecoder().decode(outBuf);
    // console.log('json string', str);

    // print json object
    console.log('json object', JSON.parse(str));
  } catch (err) {
    console.error('unable to decompress', err);
  }
  console.groupEnd('file: ', i);
}

async function init() {
  for (let i in urls) await exec(i)
}
init()
<script src="https://cdnjs.cloudflare.com/ajax/libs/pako/2.0.3/pako.min.js" integrity="sha512-yJSo0YTQvvGOqL2au5eH0W4K/0FI0sTJuyHjiHkh0O31Lzlb814P0fDXtuEtzOj13lOBZ9j99BjqFx4ASz9pGA==" crossorigin="anonymous" referrerpolicy="no-referrer"></script>

I tried several packages like wasm-flate, wasm-gzip, d5ly and pako, but pako had the highest success rate. You can consider the others if you prefer.

Edit: added code comments and disabled console.log of full json string
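
As a usage note rather than part of the original answer: in a React component the same fetch-and-inflate logic would typically sit in a useEffect hook. The component name below is made up, the URL is the one from the question, and the pako call is the same one used above.

import { useEffect, useState } from "react";
import pako from "pako";

// hypothetical component showing where the decompression fits in a React app
function LargeDataLoader() {
  const [data, setData] = useState(null);

  useEffect(() => {
    async function load() {
      const res = await fetch(
        "https://abcd.blob.core.windows.net/mydir/largefilecompressed.json.gz",
        { mode: "cors" }
      );
      const buf = await res.arrayBuffer();
      // inflate the gzip body, then parse the resulting JSON text
      setData(JSON.parse(pako.inflate(new Uint8Array(buf), { to: "string" })));
    }
    load().catch((err) => console.error("unable to load data", err));
  }, []);

  return data ? <p>Loaded {Object.keys(data).length} top-level keys</p> : <p>Loading…</p>;
}

export default LargeDataLoader;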

Avinash Thakur

You can use the npm package pako for this. Assuming you receive the compressed data as a base64 string from an API or socket, you first decode it, then decompress it with pako into a string. For example:

import pako from "pako";

// `data` is the base64-encoded, gzipped payload received from the API/socket

// decode the base64 string into a binary string (window.atob in the browser)
const base64Decoded = window.atob(data);

// convert the decoded string to a Uint8Array
const bytes = new Uint8Array(
  base64Decoded.split("").map((char) => char.charCodeAt(0))
);

// then decompress the bytes with pako into a JSON string
const json = pako.inflate(bytes, { to: "string" });
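
For completeness, here is a minimal Node-side sketch (not part of the answer) of how such a base64 payload could be produced before being sent over the API or socket; the payload contents are made up for illustration.

// Node backend: gzip the JSON, then base64-encode it for transport
const zlib = require("zlib");

const payload = { points: [{ lat: 12.97, lng: 77.59 }] }; // hypothetical data
const compressed = zlib.gzipSync(JSON.stringify(payload));
const base64 = compressed.toString("base64"); // this is the `data` the snippet above decodes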
Huynh Triet