I have a CSV file containing customer info, one customer per row. The file is about 170,000 lines long.
The app first parsed the whole file line by line and saved each line as a Customer object into an ArrayList, which means the list also grew to roughly 170k elements.
The relevant code looks like this:
final class CustomerInfoLineProcessor implements LineProcessor<CustomerInfo> {
    ...

    @Override
    public boolean processLine(final String line) {
        parseLine(line);
        return true;
    }

    private void parseLine(final String line) {
        try {
            if (!line.trim().isEmpty()) {
                // do job
            }
        } catch (final RuntimeException e) {
            handleLineError(e.getClass().getName() + ": " + e.getMessage(), e, lineStatus);
        }
    }

    ...
}
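LineProcessor here is Guava's com.google.common.io.LineProcessor, and the processor is driven roughly like the sketch below (the file path, charset, and wrapper class are simplified placeholders, not the actual code):

import com.google.common.io.Files;
import java.io.File;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

final class CustomerInfoReader {
    // Guava calls processLine() once per line and stops early only if it returns false;
    // the value returned here is whatever the processor's getResult() produces.
    static CustomerInfo readCustomers(final File csvFile) throws IOException {
        return Files.asCharSource(csvFile, StandardCharsets.UTF_8)
                .readLines(new CustomerInfoLineProcessor());
    }
}

Since processLine() always returns true, readLines() should only stop once it reaches the end of the file.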
Intermittently, the parsing process ended abnormally partway through the file. No errors or runtime exceptions were thrown, and the process itself was not stopped: the app simply carried on with its further jobs based on whatever was in the ArrayList at that point.
At first I suspected that some invisible characters hidden somewhere in the file were causing the process to quit early, but that possibility was ruled out after the same app processed the same file without any problem on my test machine.
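If it helps, this is a minimal sketch of the kind of scan for non-printable characters I had in mind (the file path is just a placeholder):

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.List;

final class InvisibleCharScanner {
    public static void main(final String[] args) throws IOException {
        final List<String> lines =
                Files.readAllLines(Paths.get("customers.csv"), StandardCharsets.UTF_8);
        for (int i = 0; i < lines.size(); i++) {
            final String line = lines.get(i);
            for (int j = 0; j < line.length(); j++) {
                final char c = line.charAt(j);
                // Flag control characters and other non-printable code points.
                if (Character.isISOControl(c)) {
                    System.out.printf("line %d, column %d: U+%04X%n", i + 1, j + 1, (int) c);
                }
            }
        }
    }
}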
My second guess was that the heap setting -Xmx256m was too small. To see what a genuine memory problem would look like, I changed it to an even smaller value, -Xmx128m; the app immediately threw an OutOfMemoryError and was terminated. Since nothing like that happened with -Xmx256m, memory usage did not seem to be the issue either.
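To double-check the memory theory, I could also log heap usage while the file is being parsed, something along these lines (the interval and log format are arbitrary choices for illustration):

// Minimal sketch: print current heap usage every N processed lines.
private static final int LOG_EVERY_N_LINES = 10_000;
private int processedLines;

private void logHeapUsage() {
    if (++processedLines % LOG_EVERY_N_LINES == 0) {
        final Runtime rt = Runtime.getRuntime();
        final long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        final long maxMb = rt.maxMemory() / (1024 * 1024);
        System.out.printf("line %d: heap used %d MB of max %d MB%n",
                processedLines, usedMb, maxMb);
    }
}

Calling logHeapUsage() from processLine() would show whether the heap is anywhere near the -Xmx limit at the point where parsing stops.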
Are there any other possible causes I have not yet thought about?