In my application I'm using CSVReader and Hibernate to import a large number of entities (1,500,000 or more) into the database from a CSV file. The code looks like this:
Session session = headerdao.getSessionFactory().openSession();
Transaction tx = session.beginTransaction();
int count = 0;
String[] nextLine;
while ((nextLine = reader.readNext()) != null) {
    try {
        if (nextLine.length == 23
                && Integer.parseInt(nextLine[0]) > lastIdInDB) {
            JournalHeader current = parseJournalHeader(nextLine);
            current.setChain(chain);
            session.save(current);
            count++;
            if (count % 100 == 0) {
                // flush and commit the batch, then clear the session
                session.flush();
                tx.commit();
                session.clear();
                tx.begin();
            }
            if (count % 10000 == 0) {
                LOG.info(count);
            }
        }
    } catch (NumberFormatException e) {
        e.printStackTrace();
    } catch (ParseException e) {
        e.printStackTrace();
    }
}
tx.commit();
session.close();
With large enough files (somewhere around 700,000 lines) I get an OutOfMemoryError (Java heap space).
The problem seems to be Hibernate-related: if I comment out just the line session.save(current);, everything runs fine. With it in place, Task Manager shows the memory usage of javaw climbing steadily, then at some point the parsing gets very slow and the import crashes, even though I flush, commit, and clear the session every 100 rows.
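For comparison, the batch-processing pattern from the Hibernate manual keeps a single transaction open for the whole import and clears the session after each flushed batch, rather than committing per batch as I do above. A minimal sketch of that pattern, assuming hibernate.jdbc.batch_size is set to a matching value in the configuration (filtering and error handling omitted):

Session session = headerdao.getSessionFactory().openSession();
Transaction tx = session.beginTransaction();
int count = 0;
String[] nextLine;
while ((nextLine = reader.readNext()) != null) {
    JournalHeader current = parseJournalHeader(nextLine);
    current.setChain(chain);
    session.save(current);
    if (++count % 100 == 0) {
        session.flush();  // push this batch of inserts to the database
        session.clear();  // detach the batch so it can be garbage collected
    }
}
tx.commit();
session.close();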
parseJournalHeader() does nothing special; it just builds an entity from the String[] that the CSV reader returns.
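In case it is relevant, my understanding is that a StatelessSession would sidestep the first-level cache entirely, since it keeps no reference to the entities it writes. A sketch of the same loop using one (same reader, parseJournalHeader, and chain as above; filtering and error handling again omitted):

StatelessSession session = headerdao.getSessionFactory().openStatelessSession();
Transaction tx = session.beginTransaction();
String[] nextLine;
while ((nextLine = reader.readNext()) != null) {
    JournalHeader current = parseJournalHeader(nextLine);
    current.setChain(chain);
    session.insert(current);  // executes the INSERT immediately; nothing is cached
}
tx.commit();
session.close();

A StatelessSession performs no caching and no cascading, so each insert() goes straight to JDBC.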