I'm working on a Java web application in which I have a method that should read a .xlsx file using Apache POI:
public static void xlsx(String arquivo) throws IOException {
    // try-with-resources guarantees the stream and workbook are closed
    // even if an exception is thrown mid-read
    try (FileInputStream file = new FileInputStream(new File(arquivo));
         XSSFWorkbook workbook = new XSSFWorkbook(file)) {
        XSSFSheet sheet = workbook.getSheetAt(0);
        Iterator<Row> rowIterator = sheet.iterator();
        while (rowIterator.hasNext()) {
            Row row = rowIterator.next();
            Iterator<Cell> cellIterator = row.cellIterator();
            while (cellIterator.hasNext()) {
                Cell celula = cellIterator.next();
                /* here do the reading for each cell */
            }
        }
    } catch (IOException e) {
        e.printStackTrace();
        // pass the caught exception itself, not e.getCause(),
        // so the original stack trace is preserved
        throw new IOException("Error processing the file.", e);
    }
}
The method works correctly; however, it will likely have to process files with thousands of records, for example around 25,000 to 300,000 rows. When processing a large file I get the following exception:
(http-localhost-127.0.0.1-8080-4) Servlet.service() for servlet RestServlet threw exception: org.jboss.resteasy.spi.UnhandledException: java.lang.OutOfMemoryError: Java heap space
I need to know how I can avoid this type of error. For example, could I read and process the .xlsx file 1,000 rows at a time, or is there some other solution?
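For what it's worth, batching alone may not help as long as `new XSSFWorkbook(file)` is used, since that constructor loads the entire workbook into memory before any row is read; the usual remedy is POI's streaming event API (`XSSFReader` with a SAX handler), which delivers rows one at a time. The "1,000 rows at a time" idea then maps onto a generic accumulate-and-flush pattern like the sketch below, which could sit inside such a row handler. This is plain Java with hypothetical names (`BatchSketch`, `processInBatches`), not POI code:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.stream.IntStream;

public class BatchSketch {

    // Accumulate rows into a buffer, handle the buffer every batchSize
    // rows, then clear it, so at most one batch is held in memory.
    // Returns the number of batches flushed (handy for checking).
    static <T> int processInBatches(Iterator<T> rows, int batchSize) {
        List<T> buffer = new ArrayList<>(batchSize);
        int batches = 0;
        while (rows.hasNext()) {
            buffer.add(rows.next());
            if (buffer.size() == batchSize) {
                // process the full batch here (e.g. a JDBC batch insert)
                buffer.clear();
                batches++;
            }
        }
        if (!buffer.isEmpty()) {
            // flush the final partial batch
            buffer.clear();
            batches++;
        }
        return batches;
    }

    public static void main(String[] args) {
        // 2,500 fake "rows" in batches of 1,000 -> 1000 + 1000 + 500
        Iterator<Integer> rows = IntStream.range(0, 2500).iterator();
        System.out.println(processInBatches(rows, 1000)); // prints 3
    }
}
```

Note that with the streaming reader the buffer holds only raw cell values for the current batch, so heap usage stays bounded regardless of how many rows the file has.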