I am extremely new to JavaScript/web programming. My upload form is mostly used for CSV files. I am already using pako to gzip the JSON I send (in the request URL).
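For reference, the existing JSON compression is nothing special; it looks roughly like this (payload and the encoding step are just illustrative placeholders):

var compressed = pako.gzip(JSON.stringify(payload)); // pako.gzip returns a Uint8Array
// ...which I then encode (e.g. base64) and put in the request URL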
How can I gzip the files before they are sent to the server?
This is roughly how I construct the FormData:
$.each($("input[type=file]"), function(i, obj) {
$.each(obj.files, function(j, file) {
formData.append(obj.name, file); // we need to gzip the data
})
});
Edit 1: I've managed to (I think) gzip the files using pako, but there's one issue: async problems. This is my new code:
$.each($("input[type=file]"), function(i, obj) {
$.each(obj.files, function(j, file) {
formData.append(obj.name, file); // we need to gzip the data
var r = new FileReader();
r.onload = function(){
var zippedResult = pako.gzip(r.result);
var oMyBlob = new Blob(zippedResult, {type : file.type}); // the blob
formData.append(obj.name, oMyBlob); // we need to gzip the data
};
r.readAsArrayBuffer(file);
})
});
// Time to send the formData!
$.ajax({......
As you can see, the issue is that the onload callbacks only run after the $.ajax call has already executed, so the formData is still empty when it is sent.
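Would something like this be the right way to wait for all the readers before sending? A rough sketch (untested); gzipFile and uploadUrl are just placeholder names I made up:

function gzipFile(file) {
    return new Promise(function (resolve, reject) {
        var r = new FileReader();
        r.onload = function () {
            // wrap the ArrayBuffer in a Uint8Array for pako, and wrap the
            // gzipped result in an array for the Blob constructor
            resolve(new Blob([pako.gzip(new Uint8Array(r.result))], {type: file.type}));
        };
        r.onerror = reject;
        r.readAsArrayBuffer(file);
    });
}

var jobs = [];
$.each($("input[type=file]"), function (i, obj) {
    $.each(obj.files, function (j, file) {
        jobs.push(gzipFile(file).then(function (blob) {
            formData.append(obj.name, blob, file.name);
        }));
    });
});

Promise.all(jobs).then(function () {
    // only now is formData fully populated
    $.ajax({
        url: uploadUrl,      // placeholder endpoint
        type: "POST",
        data: formData,
        processData: false,  // keep jQuery from serializing the FormData
        contentType: false   // let the browser set the multipart boundary
    });
});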
Edit 2: I'm attempting to use an onchange event on the file inputs; this is what I have come up with so far. There is a problem though: it doesn't seem to be zipping correctly. Data type issues?
$("input[type=file]").change(function (event){
var fileList = this.files;
$.each(fileList,function(i,file){
var r = new FileReader();
r.onload = function(){
var zippedResult = pako.gzip(r.result);
var oMyBlob = new Blob(zippedResult, {type : file.type});
app.formData.append(event.target.name, oMyBlob, file.name);
};
r.readAsArrayBuffer(file);
});
});
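My guess is that the "data type issue" is in how I feed pako and build the Blob: as far as I can tell, pako.gzip wants a Uint8Array (or a string) and returns a Uint8Array, and the Blob constructor wants an array of parts rather than the Uint8Array itself. So maybe the onload should look more like this (rough sketch, untested)?

r.onload = function () {
    var zippedResult = pako.gzip(new Uint8Array(r.result)); // r.result is an ArrayBuffer
    var oMyBlob = new Blob([zippedResult], {type: file.type}); // note the array wrapper
    app.formData.append(event.target.name, oMyBlob, file.name);
};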