
I am using Vue.js along with DataTransfer to upload files asynchronously, and I want to allow multiple files to be dragged and dropped for upload at once.

I can get the first upload to happen, but by the time that upload is done, JavaScript has either garbage collected or changed the DataTransfer items object.

How can I rework this (or clone the event/DataTransfer object) so that the data is still available to me throughout the ajax calls?

I've followed the MDN docs on how to use DataTransfer, but I'm having a hard time applying them to my specific case. I have also tried copying the event object, as you can see in my code, but that obviously doesn't make a deep copy; it just passes the reference, which doesn't help.
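For example, here is a minimal illustration of the reference problem (separate from my component code, with a plain object standing in for the real event):

```javascript
// Assigning an object to a new variable copies only the reference,
// not the underlying data. A plain object stands in for the drop event.
const event = { dataTransfer: { items: ['a', 'b'] } };
const copy = event;            // same object, not a deep copy
event.dataTransfer.items = []; // the browser clears the list after the handler...
console.log(copy.dataTransfer.items.length); // 0 -- the "copy" sees it too
```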

    methods: {
        dropHandler: function (event) {
            if (event.dataTransfer.items) {
                let i = 0;
                let self = this;
                let ev = event;

                function uploadHandler() {
                    let items = ev.dataTransfer.items;
                    let len = items.length;

                    // len NOW EQUALS 4

                    console.log("LEN: ", len);
                    if (items[i].kind === 'file') {
                        var file = items[i].getAsFile();
                        $('#id_file_name').val(file.name);
                        var file_form = $('#fileform2').get(0);
                        var form_data = new FormData(file_form); 

                        if (form_data) {
                            form_data.append('file', file);
                            form_data.append('type', self.type);
                        }

                        $('#file_progress_' + self.type).show();
                        var post_url = '/blah/blah/add/' + self.object_id + '/'; 
                        $.ajax({
                            url: post_url,
                            type: 'POST',
                            data: form_data,
                            contentType: false,
                            processData: false,
                            xhr: function () {
                                var xhr = $.ajaxSettings.xhr();
                                if (xhr.upload) {
                                    xhr.upload.addEventListener('progress', function (event) {
                                        var percent = 0;
                                        var position = event.loaded || event.position;
                                        var total = event.total;
                                        if (event.lengthComputable) {
                                            percent = Math.ceil(position / total * 100);
                                            $('#file_progress_' + self.type).val(percent);
                                        }
                                    }, true);
                                }
                                return xhr;
                            }
                        }).done((response) => {
                                i++;
                                if (i < len) {

                                    // BY NOW, LEN = 0.  ????

                                    uploadHandler();
                                } else {
                                    self.populate_file_lists();
                                }
                            }
                        );
                    }
                }

                uploadHandler();
            }
        },
Brad
trpt4him
    The issue isn't even specific to Vue.js... it's an issue with vanilla JS apps as well. I made a simpler test case to reproduce the issue: https://jsfiddle.net/rjq6b83t/1/ If you use the browser's developer tools, you'll see that the "next loop" doesn't even occur, as the DataTransfer instance seems to be dead by that time. – Brad Dec 07 '19 at 17:29
  • @Brad what about pushing promises into an array and handling them later with `Promise.all`? https://jsfiddle.net/g5h4ajm8/2/ – Temo Tchanukvadze Dec 08 '19 at 13:49
  • @TemoJr. Yeah, that works, I think the key is getting the `entry` before going off the call stack. – Brad Dec 08 '19 at 18:27

3 Answers


Once you call `await`, you're no longer in the original call stack of the function. This matters particularly in event listeners.

We can reproduce the same effect with setTimeout:

dropZone.addEventListener('drop', async (e) => {
  e.preventDefault();
  console.log(e.dataTransfer.items);
  setTimeout(()=> {
    console.log(e.dataTransfer.items);
  })
});

For example, dragging four files will output:

DataTransferItemList {0: DataTransferItem, 1: DataTransferItem, 2: DataTransferItem, 3: DataTransferItem, length: 4}  
DataTransferItemList {length: 0}

After the event handler has returned, the state has changed and the items have been lost.

There are two ways to handle this issue:

  • Copy the items and iterate over them
  • Push async jobs (Promises) into an array and handle them later with `Promise.all`

The second solution is more intuitive than using `await` in the loop. Also, consider that browsers limit the number of parallel connections; with an array you can split the work into chunks to cap simultaneous uploads.
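A minimal sketch of that chunking idea, where `uploadFile` is a placeholder for whatever per-file function returns a Promise (e.g. a `fetch` POST):

```javascript
// Split an array into chunks of at most `size` items.
function chunk(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

// Upload at most `limit` files at a time: each chunk runs in
// parallel via Promise.all, and chunks run one after another.
async function uploadInChunks(files, uploadFile, limit = 4) {
  const results = [];
  for (const group of chunk(files, limit)) {
    results.push(...await Promise.all(group.map(uploadFile)));
  }
  return results;
}
```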

function pointlessDelay() {
  return new Promise((resolve, reject) => {
    setTimeout(resolve, 1000);
  });
}

const dropZone = document.querySelector('.dropZone');

dropZone.addEventListener('dragover', (e) => {
  e.preventDefault();
});

dropZone.addEventListener('drop', async (e) => {
  e.preventDefault();
  console.log(e.dataTransfer.items);
  const queue = [];
  
  for (const item of e.dataTransfer.items) {
    console.log('next loop');
    const entry = item.webkitGetAsEntry();
    console.log({item, entry});
    queue.push(pointlessDelay().then(x=> console.log(`${entry.name} uploaded`)));
  }
  
  await Promise.all(queue);
});
body {
  font-family: sans-serif;
}

.dropZone {
  display: inline-flex;
  background: #3498db;
  color: #ecf0f1;
  border: 0.3em dashed #ecf0f1;
  border-radius: 0.3em;
  padding: 5em;
  font-size: 1.2em;
}
<div class="dropZone">
  Drop Zone
</div>
Temo Tchanukvadze
  • Using `Promise.all` when sending requests can potentially run into the connection limit. For example, if a user tries to upload 20 files at once, the browser will choke on a few [overhead requests](https://stackoverflow.com/questions/985431/max-parallel-http-connections-in-a-browser). But of course, on the other hand, it is quicker than an `async ... loop` – Alexandr Tovmach Dec 11 '19 at 19:04
  • Good point @AlexandrTovmach. In that case, the solution is to split the array into chunks and create a queue. – Temo Tchanukvadze Dec 11 '19 at 19:08
  • ...or just use `async...loop` =) – Alexandr Tovmach Dec 11 '19 at 19:35
  • @AlexandrTovmach if you use an `async` loop then the requests won't execute in parallel. By splitting the promises array into chunks and using `Promise.all`, you get faster results while staying safe from any limits. – Christos Lytras Dec 13 '19 at 11:03
  • Yes, I know, and I noted that in my previous comment, but splitting the queue into chunks is a bit of overhead from a code perspective. Just remember: "You're writing code for humans, not for machines", and you don't need to think about "faster results" before you actually have issues with that. – Alexandr Tovmach Dec 13 '19 at 11:52
  • what about a generator? Something along these lines, if I am understanding the problem domain and this answer correctly... https://medium.com/javascript-scene/the-hidden-power-of-es6-generators-observable-async-flow-control-cfa4c7f31435 – Jacob Dec 14 '19 at 03:17

It seems the DataTransfer context is lost over time. My solution is to copy the required data before it is lost and reuse it when needed:

const files = [...e.dataTransfer.items].map(item => item.getAsFile());

Modified code from @Brad's jsfiddle, using my solution:

const dropZone = document.querySelector(".dropZone");
const sendFile = file => {
  const formData = new FormData();
  for (const name in file) {
    formData.append(name, file[name]);
  }
  /**
   * https://docs.postman-echo.com/ - postman mock server
   * https://cors-anywhere.herokuapp.com/ - CORS proxy server
   **/
  return fetch(
    "https://cors-anywhere.herokuapp.com/https://postman-echo.com/post",
    {
      method: "POST",
      body: formData
    }
  );
};

dropZone.addEventListener("dragover", e => {
  e.preventDefault();
});

dropZone.addEventListener("drop", async e => {
  e.preventDefault();
  const files = [...e.dataTransfer.items].map(item => item.getAsFile());
  const responses = [];

  for (const file of files) {
    const res = await sendFile(file);
    responses.push(res);
  }
  console.log(responses);
});
body {
  font-family: sans-serif;
}

.dropZone {
  display: inline-flex;
  background: #3498db;
  color: #ecf0f1;
  border: 0.3em dashed #ecf0f1;
  border-radius: 0.3em;
  padding: 5em;
  font-size: 1.2em;
}
<div class="dropZone">
  Drop Zone
</div>
Alexandr Tovmach

I ran into this problem and was looking to persist the entire DataTransfer object, not just the items or types, because my asynchronous code's API consumes the DataTransfer type itself. What I ended up doing is creating a `new DataTransfer()` and effectively copying over the original's properties (except the drag image).

Here's the gist (in TypeScript): https://gist.github.com/mitchellirvin/261d82bbf09d5fdee41715fa2622d4a6

// https://developer.mozilla.org/en-US/docs/Web/API/DataTransferItem/kind
enum DataTransferItemKind {
  FILE = "file",
  STRING = "string",
}

/**
 * Returns a properly deep-cloned object of type DataTransfer. This is necessary because dataTransfer items are lost
 * in asynchronous calls. See https://stackoverflow.com/questions/55658851/javascript-datatransfer-items-not-persisting-through-async-calls
 * for more details.
 * 
 * @param original the DataTransfer to deep clone
 */
export function cloneDataTransfer(original: DataTransfer): DataTransfer {
  const cloned = new DataTransfer();
  cloned.dropEffect = original.dropEffect;
  cloned.effectAllowed = original.effectAllowed;

  const originalItems = original.items;
  let i = 0;
  let originalItem = originalItems[i];
  while (originalItem != null) {
    switch (originalItem.kind) {
      case DataTransferItemKind.FILE:
        const file = originalItem.getAsFile();
        if (file != null) {
          cloned.items.add(file);
        }
        break;
      case DataTransferItemKind.STRING:
        cloned.setData(originalItem.type, original.getData(originalItem.type));
        break;
      default:
        console.error("Unrecognized DataTransferItem.kind: ", originalItem.kind);
        break;
    }

    i++;
    originalItem = originalItems[i];
  }
  return cloned;
}

You can consume this like so, and then use `clone` in the same way you originally planned to use `evt.dataTransfer`:

const clone = cloneDataTransfer(evt.dataTransfer);

M. Irvin