We've been having problems with our app where it runs out of memory while producing a CSV file, specifically on large exports with more than 10k rows. We are using Spring Boot 2.0.8 and SuperCSV 2.4.0.
What would be the correct approach to handle these cases, so that our Spring MVC API does not crash with an OutOfMemoryError?
Could SuperCSV be the cause of this problem? I'd imagine it's not, but I'm asking just in case.
I have also been reading about @Async; would it be a good idea to use it on this method so that the export runs on a separate thread?
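For reference, this is roughly how I understand @Async would be wired up (a minimal sketch only, using a hypothetical CsvExportService; I'm not sure it actually helps with memory, since the full list would still be built):

import java.util.List;

import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.Async;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.stereotype.Service;

@Configuration
@EnableAsync
class AsyncConfig {
    // @EnableAsync is required for Spring to detect @Async methods on beans
}

@Service
class CsvExportService {

    @Async
    public void writeCsvInBackground(List<?> data) {
        // runs on a thread from Spring's task executor,
        // but the whole data list is still held in memory
    }
}

My doubt is that moving the work to another thread only frees the request thread; the List<?> returned by getData() would still be loaded in full.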
Suppose I have the following method in a controller:
@RequestMapping(value = "/export", method = RequestMethod.GET)
public void downloadData(HttpServletRequest request, HttpServletResponse response)
        throws SQLException, ManualException, IOException, NoSuchMethodException, InvocationTargetException, IllegalAccessException {

    List<?> data = dataFetchService.getData();

    ICsvBeanWriter csvWriter = new CsvBeanWriter(response.getWriter(), CsvPreference.STANDARD_PREFERENCE);

    // these next lines handle the header
    String[] header = getHeaders(data.get(0).getClass());
    String[] headerLocale = new String[header.length];
    for (int i = 0; i < header.length; i++) {
        headerLocale[i] = localeService.getLabel(this.language, header[i]);
    }

    // fix for Excel not opening CSV files with "ID" in the first cell
    if (headerLocale[0].equals("ID")) {
        // adding a space before ID, as ' ID', also helps
        headerLocale[0] = headerLocale[0].toLowerCase();
    }
    csvWriter.writeHeader(headerLocale);

    // the next lines handle the content
    for (Object line : data) {
        csvWriter.write(line, header);
    }
    csvWriter.close();
    response.getWriter().flush();
    response.getWriter().close();
}
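For comparison, this is the direction I've been experimenting with (a minimal sketch meant to sit in the same controller; it assumes dataFetchService could expose a hypothetical paged variant getData(offset, limit), which it does not have today, and it omits the locale/header translation step for brevity):

private static final int PAGE_SIZE = 1000;

@RequestMapping(value = "/export-paged", method = RequestMethod.GET)
public void downloadDataPaged(HttpServletResponse response)
        throws SQLException, IOException, NoSuchMethodException, InvocationTargetException, IllegalAccessException {

    response.setContentType("text/csv");

    // try-with-resources ensures the writer is closed even if a row fails
    try (ICsvBeanWriter csvWriter = new CsvBeanWriter(response.getWriter(), CsvPreference.STANDARD_PREFERENCE)) {
        String[] header = null;
        int offset = 0;
        List<?> page;

        // fetch and write one page at a time instead of loading everything up front
        while (!(page = dataFetchService.getData(offset, PAGE_SIZE)).isEmpty()) {
            if (header == null) {
                header = getHeaders(page.get(0).getClass());
                csvWriter.writeHeader(header);
            }
            for (Object line : page) {
                csvWriter.write(line, header);
            }
            csvWriter.flush(); // push each page to the client instead of buffering it all
            offset += PAGE_SIZE;
        }
    }
}

Is something along those lines (paging the query and flushing per batch) the recommended approach, or is there a more standard Spring MVC / SuperCSV pattern for large CSV downloads?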