3

I have a problem with SHA-256 hashing. If the file size is more than 250 MB, the browser hangs and crashes. Below is the hashing code; please help.

let fileReader = new FileReader();
fileReader.readAsArrayBuffer(fileToSend);

fileReader.onload = (e) => {
  const hash = CrypTo.SHA256(this.arrayBufferToWordArray(fileReader.result)).toString();
  this.hashCode = hash;
  this.fileHistory.MediaHash = hash;
  this.fileHistory.FileName = fileToSend.name;

  // Insert to file history
  this.fileHistoryService.postFiles(this.fileHistory).subscribe(
    data => {
      this.hashCode = data["MediaHash"];
      this.alertService.success('HASHFILE.FileUploadSuccessMessage', true);
      this.hideGenerateHashCodeButton = true;
    },
    error => {
      this.alertService.error('COMMONERRORMESSAGE.SomethingWentWrongErrorMessage');
    });
};

arrayBufferToWordArray(fileResult) {
  var i8a = new Uint8Array(fileResult);
  var byteArray = [];
  for (var i = 0; i < i8a.length; i += 4) {
    byteArray.push(i8a[i] << 24 | i8a[i + 1] << 16 | i8a[i + 2] << 8 | i8a[i + 3]);
  }
  return CrypTo.lib.WordArray.create(byteArray, i8a.length);
}
Sharad

2 Answers

1

The code below is what I tested with large files; it fixed the problem for me.

var hashdata = CrypTo.algo.SHA256.create();
var file = <FiletoHash>; // placeholder: the File object to hash
if (file) {
  var reader = new FileReader();
  var size = file.size;
  var chunk_size = Math.pow(2, 22);
  var chunks = [];

  var offset = 0;
  var bytes = 0;
  reader.onloadend = (e) => {
    if (reader.readyState == FileReader.DONE) {

      // every chunk that is read updates the hash
      hashdata.update(this.arrayBufferToWordArray(reader.result));

      let chunk: any = reader.result;
      bytes += chunk.length;
      chunks.push(chunk);
      if (offset < size) {
        offset += chunk_size;
        var blob = file.slice(offset, offset + chunk_size);
        reader.readAsArrayBuffer(blob);
      } else {
        // finally generate the hash; use this value as the result
        var hash = hashdata.finalize().toString();
      }
    }
  };
  var blob = file.slice(offset, offset + chunk_size);
  reader.readAsArrayBuffer(blob);
}

arrayBufferToWordArray(fileResult) {
  var i8a = new Uint8Array(fileResult);
  return CrypTo.lib.WordArray.create(i8a, i8a.length);
}
Sharad
  • I got the general idea of your algorithm. `bytes` and `chunks` are not used, though, which is weird. Also, if you store each result in a chunk, aren't you loading the whole file into memory? Why are you binding `onloadend` instead of `onload`? The latter removes the need for your first `IF`. – Guillaume F. Jul 10 '20 at 09:12
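
For reference, here is a trimmed sketch of the same chunked approach that follows the comment's suggestions (bind `onload`, do not retain the chunks). The `hashFileInChunks` function and its `onDone` callback are hypothetical names, and the import line assumes the `CrypTo` alias used in the question maps to the crypto-js package with its typed-array support:

import * as CrypTo from 'crypto-js'; // assumption: how the question's CrypTo alias is imported

// Sketch only: hash a File slice by slice, keeping nothing but the running hash state.
function hashFileInChunks(fileToSend: File, onDone: (hash: string) => void) {
  const hashdata = CrypTo.algo.SHA256.create();
  const chunkSize = Math.pow(2, 22); // 4 MB per slice
  let offset = 0;
  const reader = new FileReader();

  const readNext = () =>
    reader.readAsArrayBuffer(fileToSend.slice(offset, offset + chunkSize));

  // onload fires only for successful reads, so no readyState check is needed
  reader.onload = () => {
    const buffer = reader.result as ArrayBuffer;
    // Feed the slice into the running hash and drop it; nothing accumulates in memory
    hashdata.update(CrypTo.lib.WordArray.create(new Uint8Array(buffer)));
    offset += buffer.byteLength;
    if (offset < fileToSend.size) {
      readNext();
    } else {
      onDone(hashdata.finalize().toString());
    }
  };

  readNext();
}
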
0

You should definitely use streams, or something like them, to avoid loading the whole file into memory.

Specifically with CryptoJS, I have seen that it's possible to perform progressive hashing.

var sha256 = CryptoJS.algo.SHA256.create();
sha256.update("Message Part 1");
sha256.update("Message Part 2");
sha256.update("Message Part 3");
​
var hash = sha256.finalize();

So, use FileReader to read the file in parts, and every time you read a part, update the sha256 with it until there is nothing more to read.
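
Applied to a File object in TypeScript, that loop might look like the sketch below. It assumes the crypto-js package (whose `WordArray.create` accepts a `Uint8Array` when the typed-array helper is bundled) and a browser that supports `Blob.arrayBuffer()`; the `sha256OfFile` name is just for illustration.

import * as CryptoJS from 'crypto-js';

// Sketch: hash a File in fixed-size slices so only one slice is in memory at a time.
async function sha256OfFile(file: File, chunkSize = 4 * 1024 * 1024): Promise<string> {
  const sha256 = CryptoJS.algo.SHA256.create();
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    const buffer = await file.slice(offset, offset + chunkSize).arrayBuffer();
    sha256.update(CryptoJS.lib.WordArray.create(new Uint8Array(buffer)));
  }
  return sha256.finalize().toString();
}

// Usage: sha256OfFile(fileToSend).then(hash => console.log(hash));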


See:

filereader api on big files

Orelsanpls
  • I tried the link you gave, but it's not working in TypeScript. Do you have a solution for TypeScript? – Sharad Jan 31 '20 at 13:22
  • I've found these articles, do they help?: https://gist.github.com/ahoward/4394394 and https://stackoverflow.com/questions/25810051/filereader-api-on-big-files – Orelsanpls Jan 31 '20 at 13:29
  • I tried the above solution to generate the SHA-256 hash, but it gives a different hash. Can anyone give me the full solution? – Sharad Feb 03 '20 at 13:51
  • The approach NEUT explained is correct, but I am not able to solve the problem in TypeScript; please help me with this. – Sharad Feb 03 '20 at 13:52