
I am new to JS, and I need to load a file (file1), decompress part of it into a second file (file2), and then make that decompressed file2 available for the user to download, all completely browser-side (no Node.js etc.).

For decompression I have:

let fb;

const decB = document.querySelector('button[id="dec"]')
const inputB = document.querySelector('input[type="file"]')

inputB.addEventListener('change', function (e) {
    const r = new FileReader()
    r.onload = function () {
        // start and length delimit the compressed part of the file
        const archive = new Uint8Array(r.result, start, length)
        try {
            fb = pako.inflate(archive);
        } catch (err) {
            console.log(err);
        }
    }
    r.readAsArrayBuffer(inputB.files[0])
}, false)

decB.addEventListener("click", function(e) {
  try {
    const t = new TextDecoder().decode(fb)
    console.log(t)
    
  } catch(err) {
    console.log(err)
  }
}, false)

I want to be able to access the contents of the result in other functions. Is using a global variable the best way to do it, or is there a more proper solution?

InfiniteLoop
  • If you declare `result` outside the function, it definitely should be visible inside the function. Show the code you tried and the error you got. – Barmar Jul 25 '21 at 05:19
  • You probably should read https://stackoverflow.com/questions/23667086/why-is-my-variable-undefined-after-i-modify-it-inside-of-a-function-asynchron and https://stackoverflow.com/questions/14220321/how-to-return-the-response-from-an-asynchronous-call – Barmar Jul 25 '21 at 05:21
  • Thank you so much. Turns out I had an error that I kept missing. I corrected it, and `result` now behaves as it should. I re-phrased the question because I am curious if this approach of using a global variable for the file buffer is optimal, or if there is a more elegant way to do it. Thank you! – InfiniteLoop Jul 25 '21 at 05:50

2 Answers


Here is a tiny, dependency-free variant:

function decompressBlob(blob) {
  // gunzip the blob's stream and collect the result back into a Blob
  const ds = new DecompressionStream('gzip');
  const decompressedStream = blob.stream().pipeThrough(ds);
  return new Response(decompressedStream).blob();
}

function compressBlob(blob) {
  // gzip the blob's stream and collect the result back into a Blob
  const cs = new CompressionStream('gzip');
  const compressedStream = blob.stream().pipeThrough(cs);
  return new Response(compressedStream).blob();
}

const file = new File(['abc'.repeat(100)], 'filename.txt')

console.log('original file size', file.size)

compressBlob(file).then(async newBlob => {
  console.log('compressed blob size:', newBlob.size)
  
  const decompressedBlob = await decompressBlob(newBlob)
  const content1 = await decompressedBlob.text()
  const content2 = await file.text()
  const expected = 'abc'.repeat(100)

  console.log('same content:', content1 === expected)
  console.log('same content:', content2 === expected)
})

Then, if you want to download it, create an object URL and attach it to a link with a `download` attribute:

const a = document.createElement('a')
a.href = URL.createObjectURL(blob)
a.download = originalFile.name + '.gz'
a.click()
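
Once the download has started, you can free the memory again with `URL.revokeObjectURL(a.href)`.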
Endless
  • Really cool stuff, thank you! However, my script needs to be compatible with non-Blink browsers as well, hence the dependency. Also, I am specifically curious about the best modular design when you have event listeners: the best way to compartmentalize the code so that you don't end up with a giant block inside the event listener's nested callback function. I upvoted you, but I'd like to see if anybody has suggestions for what I have described. Maybe you could demonstrate the best way to do it with the two event listeners I have in my code? It's important for me as a learning opportunity. Thank you! – InfiniteLoop Jul 25 '21 at 11:56

I guess if you want to avoid callbacks and not have a giant code block, you can use async/await instead, along with the new promise-based reading methods on the blob itself (https://developer.mozilla.org/en-US/docs/Web/API/Blob/arrayBuffer):

let fileBuffer // shared with the other event listeners

input.addEventListener('change', async evt => {
  const [file] = input.files
  if (file) {
    // slice out just the compressed byte range (example offsets) and read it
    const arrayBuffer = await file.slice(28, 7412).arrayBuffer()
    const compressed = new Uint8Array(arrayBuffer)
    fileBuffer = pako.inflate(compressed)
    document.getElementById('Decompress').disabled = false
  } else {
    // input was cleared
  }
})
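
The click handler can then stay just as small, e.g. something along these lines (a sketch that reuses `fileBuffer` from above; the `'file2.bin'` filename is just a placeholder):

document.getElementById('Decompress').addEventListener('click', () => {
  // wrap the decompressed bytes in a Blob and hand it to a temporary link
  const blob = new Blob([fileBuffer])
  const a = document.createElement('a')
  a.href = URL.createObjectURL(blob)
  a.download = 'file2.bin' // placeholder name
  a.click()
  URL.revokeObjectURL(a.href)
})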
Endless