
I have a request that returns a large JSON:

function fetchDataFromApi(request, params) {

  var url = 'https://someurl/get-records?' +
            'school=someSchool&' +
            'year=';
  var yearToday = parseInt(new Date().getFullYear());

  var requests = [];
  for (var i = 2011; i < yearToday; i++) {
    requests.push(url + i);
  }

  return JSON.parse(UrlFetchApp.fetchAll(requests).map(function(e) { return e.getContentText(); }));

}

I have sliced it by year so that UrlFetchApp won't have to deal with the 50 MB blob limit. However, it throws an error:

 Exception details: InternalError: Array length 53673928 exceeds supported capacity limit.

But when I try to access each element of the response, for example

JSON.parse(UrlFetchApp.fetchAll(requests)[0].getContentText())

...it works fine.

Any idea on what causes the issue?

I have also tried dividing all responses into separate requests.

var responses = [];
for (var i = 2011; i < yearToday; i++) {
  var request = url + i;
  responses.push(UrlFetchApp.fetch(request).getContentText());
}

var json = JSON.parse([].concat.apply([], responses));
return json;

but this error is being thrown in Google Data Studio:

Array length 68522537 exceeds supported capacity limit.
  • Have you tried setting each response to a different array? What are you doing to the response? If, for example, you are setting the response into a spreadsheet you can do it each time and delete the response before making another call. – Kessy Jul 16 '21 at 09:34
  • I have tried that as well except for the "delete the response" part. What difference does it make? I need to load all responses to Data Studio. – Dale Julian Jul 20 '21 at 00:17

1 Answer

If you would like to work around the 50 MB limit, this is not the way to go.

Even if you map the response, you have to retrieve it in full first. See also this post.
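
To see why mapping does not help: `fetchAll(...).map(...)` returns an array of JSON strings, and `JSON.parse` coerces that array into one huge comma-joined string, which is both invalid JSON and large enough to trip the capacity limit. A minimal self-contained demonstration, with toy strings standing in for the real responses:

```javascript
// What the map() call in the question produces: an ARRAY of JSON strings.
var texts = ['{"a":1}', '{"b":2}'];

// JSON.parse coerces a non-string argument to a string first, which joins
// the array elements with commas. The result is not valid JSON.
var coerced = String(texts);            // '{"a":1},{"b":2}'

// Parsing each element individually works fine.
var parsed = texts.map(function (t) {
  return JSON.parse(t);
});
```

With the real responses, building that single coerced string is also what produces the enormous length reported in the error message.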

Thus, you might want (and need) to perform a separate fetch call for each of your requests - thus splitting the size of the response blob into individual blobs.

    var responses = [];
    for (var i = 2011; i < yearToday; i++) {
      var request = url + i;
      responses.push(JSON.parse(UrlFetchApp.fetch(request).getContentText()));
    }
    return responses;

Depending on the size of each request response, you might also do it in batches - e.g. a fetchAll for 2-3 requests at a time.
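
A sketch of such batching; the helper names and the batch size are illustrative, not part of the UrlFetchApp API:

```javascript
// Split the request URLs into batches of a given size, then call
// UrlFetchApp.fetchAll() on each small batch instead of on all URLs at once.
function chunk(arr, size) {
  var batches = [];
  for (var i = 0; i < arr.length; i += size) {
    batches.push(arr.slice(i, i + size));
  }
  return batches;
}

function fetchInBatches(requests, batchSize) {
  var results = [];
  chunk(requests, batchSize).forEach(function (batch) {
    // Each fetchAll call now only holds a few responses in memory at a time.
    UrlFetchApp.fetchAll(batch).forEach(function (response) {
      results.push(JSON.parse(response.getContentText()));
    });
  });
  return results;
}
```

Each batch still has to respect the per-response limits, but this keeps any single `fetchAll` call from accumulating all years at once.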

  • I've also done this, but "Expected end of stream at char x" error is being thrown. – Dale Julian Jul 21 '21 at 02:31
  • Is it because you are logging the content text? See if [this](https://stackoverflow.com/questions/52448235/troubleshooting-google-app-script-json-conversion-expected-end-of-stream-at-ch) helps. – ziganotschka Jul 21 '21 at 07:12
  • No, I don't log it to minimize the processes. Please see the edit of the original post for the other solution I tried. – Dale Julian Jul 21 '21 at 07:32
  • What if instead of pushing the responses into a common array you try for debuging purposes to write them one by one into a spreadsheet? – ziganotschka Jul 21 '21 at 08:46
  • I already tried returning only one year's getContentText() as a response. It works. But if I try to get all responses getContentText(), the error occurs. – Dale Julian Jul 21 '21 at 10:26
  • Do you need the whole contentText or only a specific object from the response? – ziganotschka Jul 21 '21 at 11:27
  • I need the whole contentText since I'm loading everything to Data Studio. – Dale Julian Jul 21 '21 at 12:28
  • If your response is so large that it exceeds allowed limits, I am not sure if there is something you can do. If it helps - please read also [this](https://stackoverflow.com/questions/52448235/troubleshooting-google-app-script-json-conversion-expected-end-of-stream-at-ch) – ziganotschka Jul 23 '21 at 09:15