
I am using the Fetch API to get a response from a server URL, and the response size is not the same every time.

Most of the time I get response data of size 262144, but sometimes the size is less than that, such as 65536 or 196608.

async function fetchData() {
  let url = "https://www.dl.dropboxusercontent.com/s/7d87jcsh0qodk78/fuel_64x64x64_uint8.raw?dl=1";
  let response = await fetch(url);
  let data = await response.body.getReader().read();
  data = data.value;
  if (data) {
    const dataBuffer = new Uint8Array(data);
    console.log(data.length);
  } else {
    console.log("action aborted");
  }
}

fetchData()
Pravin Poudel
  • Did you test this with multiple URLs? – B''H Bi'ezras -- Boruch Hashem Aug 21 '20 at 10:07
  • No, but I think there is no problem with the URL, because others are using this URL and they don't have any issues with it. I will check with other URLs too. – Pravin Poudel Aug 21 '20 at 10:11
  • Try it with multiple URLs and see if you get the same size each time; if so, it's a problem with the URL. Did you try printing the result of the response to make sure it's the same response each time? I know Dropbox limits how many times a certain IP address can download even direct-download files; perhaps you went over the limit – B''H Bi'ezras -- Boruch Hashem Aug 21 '20 at 10:13
  • I don't think I'd be able to read that data, because it is a 3D data set and I don't know how to inspect the response data – Pravin Poudel Aug 21 '20 at 10:28

2 Answers


The reader you receive from getReader() works with an internal queue. The documentation of read() says the following:

The read() method of the ReadableStreamDefaultReader interface returns a promise providing access to the next chunk in the stream's internal queue.

Fetching multiple times may end up chunking the remote data differently, which in turn results in different lengths. To read the stream to completion, check the done value returned by read().

async function fetchData() {
  const url = "https://www.dl.dropboxusercontent.com/s/7d87jcsh0qodk78/fuel_64x64x64_uint8.raw?dl=1";
  const response = await fetch(url);
  const reader = response.body.getReader();
  
  let length = 0;
  let value, done;
  // Read chunk by chunk until the stream reports done, summing chunk lengths.
  while ({value, done} = await reader.read(), !done) {
    length += value.length;
  }
  console.log(length);
}

fetchData()

However, if the intent is to read the stream to completion before taking any action, you might as well use one of the response methods like arrayBuffer(), blob(), formData(), json(), or text(), depending on the type of data you are expecting. All of the mentioned response methods read the response stream to completion.

async function fetchData() {
  const url = "https://www.dl.dropboxusercontent.com/s/7d87jcsh0qodk78/fuel_64x64x64_uint8.raw?dl=1";
  const response = await fetch(url);
  // arrayBuffer() reads the response stream to completion before resolving.
  const dataBuffer = new Uint8Array(await response.arrayBuffer());
  
  console.log(dataBuffer.length);
}

fetchData()
3limin4t0r
  • @3limin4t0r Out of curiosity, why should we use `.arrayBuffer()` instead of `.blob()`? – ihodonald Aug 21 '20 at 10:59
  • @ihodonald That depends on what you want to do with the data. I used `arrayBuffer()` here to create a `Uint8Array`, to match the types used in the question. `arrayBuffer()` and `blob()` both have their own uses, and depending on what you plan to do with the data you pick one over the other (see the short sketch after these comments). See: [What is the difference between an ArrayBuffer and a Blob?](https://stackoverflow.com/questions/11821096/what-is-the-difference-between-an-arraybuffer-and-a-blob) – 3limin4t0r Aug 21 '20 at 11:05
  • I remember reading this when I read the documentation, but I missed that the reader reads in chunks when I implemented it. Thanks a lot, and regards for the great answer. – Pravin Poudel Aug 21 '20 at 11:45
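
To illustrate the distinction discussed in the comments above, here is a minimal sketch, assuming the same URL from the question; the fetchBoth helper name is purely illustrative. Both response methods read the stream to completion, so both report the full size:

async function fetchBoth(url) {
  // blob() wraps the bytes in an immutable Blob, handy for browser APIs
  // such as URL.createObjectURL().
  const blob = await (await fetch(url)).blob();
  console.log(blob.size, blob.type);

  // arrayBuffer() exposes the raw bytes, which can be viewed with a typed array.
  const buffer = await (await fetch(url)).arrayBuffer();
  console.log(new Uint8Array(buffer).length);
}

fetchBoth("https://www.dl.dropboxusercontent.com/s/7d87jcsh0qodk78/fuel_64x64x64_uint8.raw?dl=1")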

@3limin4t0r's answer is correct. The issue does not arise from the file being compressed by Dropbox, but rather from the fact that the .read() method returns a promise that...

"provid[es] access to the next chunk in the stream's internal queue."

The value that you are retrieving is sometimes only the first chunk in the stream's internal queue. You must keep calling read() until the reader reports that the stream is finished.
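
For illustration, here is a minimal sketch of that approach (the fetchAllChunks helper name is just for this example) that collects every chunk and joins them into a single Uint8Array:

async function fetchAllChunks(url) {
  const response = await fetch(url);
  const reader = response.body.getReader();
  const chunks = [];
  let total = 0;

  // Keep reading until the stream signals completion.
  while (true) {
    const {value, done} = await reader.read();
    if (done) break;
    chunks.push(value);
    total += value.length;
  }

  // Copy the chunks into one contiguous buffer.
  const dataBuffer = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    dataBuffer.set(chunk, offset);
    offset += chunk.length;
  }
  return dataBuffer;
}

fetchAllChunks("https://www.dl.dropboxusercontent.com/s/7d87jcsh0qodk78/fuel_64x64x64_uint8.raw?dl=1")
  .then(dataBuffer => console.log(dataBuffer.length)); // should consistently log the full size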

ihodonald
  • I don't know what Dropbox uses to optimize, but if that were the case, why doesn't this occur every time? Most of the time it fetches the correct size, but sometimes it doesn't. – Pravin Poudel Aug 21 '20 at 10:26
  • @pravinpoudel Have you tried opening the image? It uses a `codec` (short for compressor/decompressor) that is synonymous with a high-resolution camera but the file size is very small for that codec. This is like Dropbox saying, "this .raw image file doesn't contain enough data to be able to be 262KB so I'm going to send you an 8KB .raw image with nothing but black in it". – ihodonald Aug 21 '20 at 10:32
  • @pravinpoudel It's just my guess, but I think it has something to do with the codecs and the fact that the image doesn't contain a lot of data. – ihodonald Aug 21 '20 at 10:33
  • @pravinpoudel @3limin4t0r has the answer. It's the `Uint8Array()` constructor that is compressing the file. – ihodonald Aug 21 '20 at 10:42
  • @ihodonald That data is not compressed. The problem is that one `read()` only reads one chunk from the stream. This may or may not be all the data, depending on how the remote data is chunked internally. – 3limin4t0r Aug 21 '20 at 11:15
  • @3limin4t0r Yeah, I thought about deleting my answer. Should I? – ihodonald Aug 21 '20 at 11:20
  • @ihodonald That's up to you. Re-read the question, then read your answer. Do you find your answer useful? Leave it up. If not, remove or update it. – 3limin4t0r Aug 21 '20 at 11:25
  • Keep being nice. +1 for your effort and your willingness to help others!!! – Pravin Poudel Aug 21 '20 at 14:50