From a Spring Boot application, while reading 1 million records from a CSV file, I get the errors shown below.
java.net.SocketTimeoutException: null
at org.apache.tomcat.util.net.NioBlockingSelector.read(NioBlockingSelector.java:201) ~[tomcat-embed-core-8.5.16.jar:8.5.16]
java.lang.OutOfMemoryError: GC overhead limit exceeded
2020-03-29 23:05:22.886 ERROR 10772 --- [nio-8080-exec-6] o.a.c.c.C.[Tomcat].[localhost] : Exception Processing ErrorPage[errorCode=0, location=/error]
org.apache.catalina.connector.ClientAbortException: java.io.IOException: An established connection was aborted by the software in your host machine.
Below is the Java code:
@Transactional
private void readFileFromFile(FileUploadDetails fileUploadDetail) throws IOException {
    String filePath = fileUploadDetail.getFilePath();
    try {
        FileReader filereader = new FileReader(filePath);
        CSVReader csvReader = new CSVReaderBuilder(filereader)
                .withSkipLines(1)
                .build();
        List<FileAnnualEnterpriseSurveyLog> fileAnnualEnterpriseSurveyLog = new ArrayList<>();
        // read all rows into memory at once
        List<String[]> allData = csvReader.readAll();
        int count = 0;
        for (String[] cell : allData) {
            FileAnnualEnterpriseSurveyLog fileAESurveyLogVo = new FileAnnualEnterpriseSurveyLog();
            fileAESurveyLogVo.setYear(cell[0]);
            ...............................................
            fileAnnualEnterpriseSurveyLog.add(fileAESurveyLogVo);
            //fileAnnualEnterpriseSurveyRepository.save(fileAESurveyLogVo);
            //fileAnnualEnterpriseSurveyRepository.
            System.out.println(count);
            count++;
        }
        System.out.println("waseem" + fileAnnualEnterpriseSurveyLog.size());
        fileAnnualEnterpriseSurveyRepository.save(fileAnnualEnterpriseSurveyLog);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
}
Could you please suggest a good approach for reading this large amount of data?
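For reference, the direction I have been considering is to stream the file line by line and persist in fixed-size batches, instead of calling `readAll()` and holding every row plus every entity in memory at once. The sketch below uses only plain `BufferedReader` to illustrate the batching idea (OpenCSV's `CSVReader` also supports per-line iteration and would handle quoting correctly); `BATCH_SIZE`, `load`, and the "save" hook are illustrative placeholders, not part of my actual repository code.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class BatchCsvLoader {
    static final int BATCH_SIZE = 1000; // illustrative; tune against heap and JDBC batch size

    // Streams the CSV one line at a time and flushes in batches.
    // Returns the number of batches flushed; in the real application each
    // flush would call repository.saveAll(batch) and then clear the batch.
    static int load(Reader source) throws IOException {
        int batches = 0;
        try (BufferedReader reader = new BufferedReader(source)) {
            reader.readLine(); // skip the header line (like withSkipLines(1))
            List<String[]> batch = new ArrayList<>(BATCH_SIZE);
            String line;
            while ((line = reader.readLine()) != null) {
                batch.add(line.split(",")); // naive split; OpenCSV handles quoted fields
                if (batch.size() == BATCH_SIZE) {
                    batches++; // saveAll(batch) would go here
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                batches++; // flush the final partial batch
            }
        }
        return batches;
    }

    public static void main(String[] args) throws IOException {
        // Simulate a 2,500-row CSV entirely in memory for the demo.
        StringBuilder csv = new StringBuilder("year,value\n");
        for (int i = 0; i < 2500; i++) {
            csv.append("2020,").append(i).append('\n');
        }
        // 2,500 rows with BATCH_SIZE = 1000 -> 3 flushes (1000 + 1000 + 500)
        System.out.println(load(new StringReader(csv.toString())));
    }
}
```

This keeps peak memory bounded by one batch rather than the whole file, which is the part of my current code (the `readAll()` call and the single accumulated list) that I suspect causes the `GC overhead limit exceeded` error.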