
I am calling a REST API endpoint that serves a very large amount of data. The amount is so large that even my Chrome tab crashes (it displays the data briefly, then keeps loading more until the tab dies). Even Postman fails to show the data: it returns a 200 OK status but displays no response body.

I'm trying to write a Java program to consume the response from this API. Is there a way to consume the response without using a lot of memory?

Please let me know if the question is not clear. Thank you!

user2430771
  • What does “consume the response from the API” mean? What do you plan to do with it? – VGR Jan 08 '20 at 19:10
  • @VGR It means make a GET call to the api, process the necessary fields which I need and store the result into my database. – user2430771 Jan 08 '20 at 19:17
  • And what is the type of the API response? Is it XML? – VGR Jan 08 '20 at 19:18
  • 1
    The API that you use does it implement any of these techniques ? https://apievangelist.com/2018/04/20/delivering-large-api-responses-as-efficiently-as-possible/ or https://medium.com/@michalbogacz/streaming-large-data-sets-f86a53e43472 – ralf htp Jan 08 '20 at 19:22
  • Example code (C and Ruby) for consuming from github REST API is in https://developer.github.com/v3/guides/traversing-with-pagination/ in paragraph Consuming the information – ralf htp Jan 08 '20 at 19:33
  • 1
    *Pagination* is most often used for this – ralf htp Jan 08 '20 at 19:42
  • Your question needs to be enhanced with details on the API you are consuming. Maybe the way you're using it led to such issues. To handle large amounts of data, APIs generally implement one of these strategies: reducing the response size, pagination, organizing using hypermedia, or schema filtering. – Katy Jan 08 '20 at 20:00
  • @VGR It's a JSON API response – user2430771 Jan 09 '20 at 16:31
  • @ralfhtp Yes I was expecting they do pagination but they don't. That would have really helped me. – user2430771 Jan 09 '20 at 16:32
  • Does this answer your question? [JAVA - Best approach to parse huge (extra large) JSON file](https://stackoverflow.com/questions/9390368/java-best-approach-to-parse-huge-extra-large-json-file) – Volkan Albayrak Jan 09 '20 at 21:55

2 Answers


One possibility is to use a streaming JSON parser such as the Jackson Streaming API (https://github.com/FasterXML/jackson-docs/wiki/JacksonStreamingApi). For example code, see https://javarevisited.blogspot.com/2015/03/parsing-large-json-files-using-jackson.html
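As a minimal sketch of the idea: with Jackson's streaming API you walk the document one token at a time, so only the current token is buffered regardless of payload size. The inline JSON string and the `"name"` field below are hypothetical stand-ins; in practice you would pass the HTTP response's `InputStream` to `createParser` instead.

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

public class StreamingDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical payload standing in for the huge API response.
        String json = "[{\"id\":1,\"name\":\"a\"},{\"id\":2,\"name\":\"b\"}]";
        JsonFactory factory = new JsonFactory();
        try (JsonParser parser = factory.createParser(json)) {
            // React to one token at a time; nothing but the current
            // token is held in memory, so usage stays flat.
            while (parser.nextToken() != null) {
                if (parser.getCurrentToken() == JsonToken.FIELD_NAME
                        && "name".equals(parser.getCurrentName())) {
                    parser.nextToken(); // advance to the field's value
                    System.out.println(parser.getText());
                }
            }
        }
    }
}
```

Requires `jackson-core` on the classpath; the same loop works unchanged when the parser is created from a network or file stream.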

For JavaScript, there is https://github.com/DonutEspresso/big-json

ralf htp

If the data really is that large, it is better to split the task:

  1. Download the full data to disk with an ordinary HTTP client
  2. Process it in bulk using a streaming approach, similar to SAX parsing for XML: JAVA - Best approach to parse huge (extra large) JSON file

With this split, you will not have to deal with possible network errors during processing, and you will keep the data consistent.
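Step 1 can be sketched with nothing but the JDK: stream the response body straight to a file in buffered chunks, so the full payload never sits in memory. The endpoint URL and `response.json` path below are placeholder assumptions.

```java
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class DownloadToDisk {
    // Copy the response body straight to disk in buffered chunks,
    // so the full payload is never held in memory at once.
    static void download(URL url, Path target) throws Exception {
        URLConnection conn = url.openConnection();
        conn.setRequestProperty("Accept", "application/json");
        try (InputStream in = conn.getInputStream()) {
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical endpoint; replace with the real API URL.
        download(new URL("https://example.com/api/large-data"),
                 Paths.get("response.json"));
        // Step 2: parse response.json with a streaming JSON parser.
    }
}
```

After the file is on disk, step 2 runs the streaming parser over it and can be retried freely without re-downloading.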

Alex Chernyshev