
I have an existing API running in a loop that creates a connection and fetches a huge JSON payload (~1 GB) in each iteration from a service through RestTemplate, as follows:

ResponseEntity<String> response = restTemplate.exchange(
        uri.get().toString(), HttpMethod.POST, entity, String.class);

The response is then converted to a complex Java object through Gson. The problem with the above approach is that RestTemplate converts the InputStream to a String through a StringBuffer, which ends up creating lots of char[] instances and eventually runs out of memory (OOM) when the loop iterates for too long, which is usually the case. In place of RestTemplate, I even used HttpClient, but it does the same thing (expands a char array).
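For reference, the conversion step currently looks roughly like this (a sketch; the real target type is a more complex object graph than List<Object>):

// Sketch of the current conversion: the whole ~1 GB body is already a String
// by the time Gson sees it, so the full object graph is built on top of it.
Gson gson = new Gson();
List<Object> objects = gson.fromJson(response.getBody(),
        new TypeToken<List<Object>>(){}.getType());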

To solve the OOM issue, I refactored the API to stream the data to a file, creating a temp file in each iteration and converting it to objects as follows:

File targetFile = new File("somepath\\response.tmp");
FileUtils.copyInputStreamToFile(response.getEntity().getContent(), targetFile);
Reader reader = new InputStreamReader(new FileInputStream(targetFile), StandardCharsets.UTF_8);
List<Object> objects = gson.fromJson(reader, new TypeToken<List<Object>>(){}.getType());
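A variant I am also considering is streaming the temp file through Gson's JsonReader (com.google.gson.stream.JsonReader), so that only one element is in memory at a time instead of the whole list (a sketch; MyItem is a placeholder for my actual element type):

try (JsonReader jsonReader = new JsonReader(
        new InputStreamReader(new FileInputStream(targetFile), StandardCharsets.UTF_8))) {
    jsonReader.beginArray();
    while (jsonReader.hasNext()) {
        // Deserialize one array element at a time instead of materializing the full list
        MyItem item = gson.fromJson(jsonReader, MyItem.class);
        // process item here, then let it become garbage
    }
    jsonReader.endArray();
}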

Is this the way to go, or is there a more effective approach to such problems? Maybe pooling connections instead of creating a new one in each iteration (would that be a considerable change?).
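By pooling I mean something roughly like the sketch below (Apache HttpClient 4.x with Spring's HttpComponentsClientHttpRequestFactory; the pool sizes are arbitrary):

// Reuse connections across iterations instead of opening a fresh one each time
PoolingHttpClientConnectionManager connectionManager = new PoolingHttpClientConnectionManager();
connectionManager.setMaxTotal(10);
connectionManager.setDefaultMaxPerRoute(5);

CloseableHttpClient httpClient = HttpClients.custom()
        .setConnectionManager(connectionManager)
        .build();

RestTemplate restTemplate = new RestTemplate(
        new HttpComponentsClientHttpRequestFactory(httpClient));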

Also, analysing the API with a 4 GB -Xmx / 2 GB -Xms heap, JVisualVM shows the following:

[JVisualVM screenshot: allocated bytes per thread]

As can be seen, the running thread has been allocated a huge number of bytes.

Heap size during the API runtime:

[JVisualVM screenshot: heap size during the API runtime]

  • What about [this answer](https://stackoverflow.com/a/31275144/10871900)? – dan1st Apr 10 '20 at 22:09
  • How often are you requesting this data? Is it possible to break the JSON object into segments and parse them individually over time and ultimately reconstruct the initial object? Is it possible some of the data isn't necessary? – Jason Apr 10 '20 at 22:15
