
I am using RestTemplate to consume a huge JSON response from an external server. My code works fine when the dataset is small, but as soon as I run it against the full dataset I am unable to map the response to the bean class. Below is my code.

public void createCustomCsv(String name, String password, String serverUrl1, String serverUrl2, String propLocation) {
    SimpleClientHttpRequestFactory requestFactory = new SimpleClientHttpRequestFactory();
    requestFactory.setBufferRequestBody(false);

    RestTemplate restTemplate = new RestTemplate();
    restTemplate.setRequestFactory(requestFactory);

    HttpHeaders httpHeaders = customHeaders.createCustomHeaders(name, password);
    // Wrap the basic-auth headers so they are sent with each request
    HttpEntity<?> httpEntity = new HttpEntity<>(httpHeaders);

    // Add the required converters
    List<HttpMessageConverter<?>> messageConverters = new ArrayList<HttpMessageConverter<?>>();
    messageConverters.add(new MappingJackson2HttpMessageConverter());
    messageConverters.add(new StringHttpMessageConverter());
    // Register the message converters with the RestTemplate
    restTemplate.setMessageConverters(messageConverters);

    ResponseEntity<MyDataBean> responseEntity1;
    ResponseEntity<MyDataBean> responseEntity2;
    try {
        long startTime = System.currentTimeMillis();
        jLog.debug("Start mapping to Pojo :: " + startTime);

        responseEntity1 = restTemplate.exchange(serverUrl1, HttpMethod.GET, httpEntity, MyDataBean.class);
        responseEntity2 = restTemplate.exchange(serverUrl2, HttpMethod.GET, httpEntity, MyDataBean.class);

        MyDataBean sampleDataBeanServer1 = responseEntity1.getBody();
        MyDataBean sampleDataBeanServer2 = responseEntity2.getBody();

        processCustomData(sampleDataBeanServer1, propLocation);
        processCustomData(sampleDataBeanServer2, propLocation);
    } catch (RestClientException e) {
        jLog.debug("Something went wrong with ws call:::" + e);
    }
}

I have tried for hours and searched SO trying to figure out a way to parse the response data into the POJO class without getting memory exceptions. I understand that this might be because the entire response is retained in memory while it is mapped to the POJO. The line below is where the memory problem shows up.

 responseEntity1 = restTemplate.exchange(serverUrl1, HttpMethod.GET, httpEntity, MyDataBean.class); 

Based on the docs, I have also set bufferRequestBody to false so that the body doesn't get loaded into memory. I am not sure why the behaviour differs from what the docs describe.

 SimpleClientHttpRequestFactory requestFactory = new SimpleClientHttpRequestFactory();
 requestFactory.setBufferRequestBody(false);

It would be great if someone is patient and kind enough to help me out of this soup! I am open to any third-party library suggestions as well.

P.S.: The above code works perfectly when the dataset is not huge. I have also omitted the bean class code for brevity.

Sid
  • You will likely need to use streaming for your JSON parsing. Take a look at Jackson's Streaming API: http://wiki.fasterxml.com/JacksonStreamingApi. – Chill May 03 '16 at 20:13
  • I have already checked that out. The problem with that approach is that I wouldn't be able to bind it to my model/pojos. Also, the JSON structure and the values I need to fetch from it are quite complicated, hence using a strongly typed reference in the form of pojos is quite helpful. That being said, I have been trying to think of a way to stream this data into the Pojo class without loading the entire response into memory. – Sid May 03 '16 at 20:22
  • Are you sure you would be able to store the pojo in memory in the first place? If the streamed data can't be held in memory, there's a good chance the pojo wouldn't fit either (since it contains all the data). – Chill May 03 '16 at 20:39
  • Well, ideally I would like to parse the data using ObjectMapper in a streaming fashion and bind it to my Pojo. I have custom headers that I am passing to the RestTemplate (basic auth), so I am unable to figure out a way to stream this data post-authentication and then bind it to the model (a rough sketch of this idea follows after these comments). – Sid May 04 '16 at 13:09
  • @Sid how large is your response? – Rams Jul 03 '19 at 08:06
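
For reference, a minimal sketch of the streaming idea discussed in the comments above, under a few assumptions: the top level of the feed is a JSON array, the question's customHeaders helper is reused for basic auth, and a hypothetical MyRecordBean/processRecord pair stands in for per-record binding and processing. RestTemplate.execute() hands the raw response stream to a ResponseExtractor while a RequestCallback carries the auth headers, so Jackson binds one element at a time instead of buffering the whole payload.

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.http.HttpMethod;
import org.springframework.web.client.RequestCallback;
import org.springframework.web.client.ResponseExtractor;
import org.springframework.web.client.RestTemplate;

public void streamRecords(String name, String password, String serverUrl, String propLocation) {
    RestTemplate restTemplate = new RestTemplate();
    ObjectMapper mapper = new ObjectMapper();

    // Carry the existing basic-auth headers on the outgoing request
    RequestCallback requestCallback = request ->
            request.getHeaders().addAll(customHeaders.createCustomHeaders(name, password));

    // Consume the body as a stream instead of materialising one giant MyDataBean
    ResponseExtractor<Void> responseExtractor = response -> {
        try (JsonParser parser = mapper.getFactory().createParser(response.getBody())) {
            if (parser.nextToken() != JsonToken.START_ARRAY) {
                throw new IllegalStateException("Expected a top-level JSON array");
            }
            // Bind and process one array element at a time
            while (parser.nextToken() == JsonToken.START_OBJECT) {
                MyRecordBean record = mapper.readValue(parser, MyRecordBean.class);
                processRecord(record, propLocation); // hypothetical per-record processing
            }
        }
        return null;
    };

    restTemplate.execute(serverUrl, HttpMethod.GET, requestCallback, responseExtractor);
}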

1 Answer


To handle data this large you would have to download it to a file by reading directly from the response stream. This link might help. Then read the downloaded file back in batches using a BufferedReader, populate your entities from each batch, and process them.

You wouldn't be able to process everything in one batch as it wouldn't fit in memory.
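
A rough sketch of that approach, with some assumptions: the response body is copied straight to a temp file via a ResponseExtractor (nothing buffered in memory), and it is then read back in fixed-size batches. The line-by-line reading and the processBatch helper are hypothetical placeholders; adapt the batching to the feed's real structure.

import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.util.StreamUtils;
import org.springframework.web.client.RestTemplate;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

void downloadAndProcess(RestTemplate restTemplate, HttpHeaders httpHeaders,
                        String serverUrl, String propLocation) throws IOException {
    // 1. Stream the response body straight to a temp file
    Path download = Files.createTempFile("feed", ".json");
    restTemplate.execute(serverUrl, HttpMethod.GET,
            request -> request.getHeaders().addAll(httpHeaders),   // re-use the basic-auth headers
            response -> {
                try (OutputStream out = Files.newOutputStream(download)) {
                    StreamUtils.copy(response.getBody(), out);
                }
                return null;
            });

    // 2. Read the file back in batches. This assumes the feed can be consumed
    //    line by line (e.g. newline-delimited records); adjust to the real format.
    List<String> batch = new ArrayList<>();
    try (BufferedReader reader = Files.newBufferedReader(download)) {
        String line;
        while ((line = reader.readLine()) != null) {
            batch.add(line);
            if (batch.size() == 1000) {
                processBatch(batch, propLocation);   // hypothetical batch processor
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            processBatch(batch, propLocation);
        }
    }
}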

basiljames
  • The server produces this json data like a feed. It is not a file that I can download for processing it locally. – Sid May 03 '16 at 20:28
  • Can you split the entity into a list of entities? A single entity with all the data would not be possible. The Google Gson API is worth looking at for streaming JSON processing: https://sites.google.com/site/gson/streaming (a rough sketch follows below). – basiljames May 03 '16 at 20:53
  • The model is based on the JSON format. It already has multiple entities chained together. The pojo that I am trying to map to is the wrapper model class. As mentioned earlier, the problem is with processing or mapping this huge data efficiently without loading it fully into memory. – Sid May 04 '16 at 05:21
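
For completeness, a minimal sketch of the Gson streaming idea from the comment above, again assuming the top level of the feed is a JSON array and reusing the hypothetical MyRecordBean/processRecord pair; JsonReader walks the stream and Gson deserialises one element at a time.

import com.google.gson.Gson;
import com.google.gson.stream.JsonReader;

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

void readWithGson(InputStream body, String propLocation) throws IOException {
    Gson gson = new Gson();
    try (JsonReader reader = new JsonReader(new InputStreamReader(body, StandardCharsets.UTF_8))) {
        reader.beginArray();
        while (reader.hasNext()) {
            // Deserialise just the next array element from the stream
            MyRecordBean record = gson.fromJson(reader, MyRecordBean.class);
            processRecord(record, propLocation);   // hypothetical per-record processing
        }
        reader.endArray();
    }
}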