You may be interested in the batch inserts chapter of the Hibernate tutorial.
The problem is not the save()
operation itself: all it does is put the saved object into the first-level cache (the session), making it persistent. It is the flush()
operation that actually triggers the inserts. And they recommend the approach below in order to achieve good performance.
The tutorial also says - I haven't tried it myself - that you could get an OutOfMemoryException
when making a very high number of rows persistent, and it seems to recommend 20
as the batch size.
When making new objects persistent, flush() and then clear() the session regularly in order to control the size of the first-level cache.
Session session = getCurrentSession();
for ( int i = 0; i < objectList.size(); i++ ) {
    session.save(objectList.get(i));
    if ( (i + 1) % 20 == 0 ) { //20, same as the JDBC batch size
        //flush a batch of inserts and release memory:
        session.flush();
        session.clear();
    }
}
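
For completeness, here is the same pattern wrapped in a transaction, the way the tutorial runs it - a minimal sketch only, where sessionFactory, MyObject and objectList are placeholder names of mine:

// imports: org.hibernate.Session, org.hibernate.SessionFactory, org.hibernate.Transaction
void saveInBatches(SessionFactory sessionFactory, List<MyObject> objectList) {
    Session session = sessionFactory.openSession();
    Transaction tx = session.beginTransaction();
    try {
        for ( int i = 0; i < objectList.size(); i++ ) {
            session.save(objectList.get(i));
            if ( (i + 1) % 20 == 0 ) { // 20, same as the JDBC batch size
                session.flush();  // execute the batched inserts
                session.clear();  // detach the flushed objects, freeing memory
            }
        }
        tx.commit(); // the last partial batch is flushed by the commit
    } catch (RuntimeException e) {
        tx.rollback();
        throw e;
    } finally {
        session.close();
    }
}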
EDIT: Also set hibernate.jdbc.batch_size
in the configuration file to 20 (or 50 if you want). With that property set, Hibernate groups the inserts into JDBC batches, so instead of 20 separate round-trips to the database you should have only one batch of 20; some drivers (for example MySQL with rewriteBatchedStatements=true) can even rewrite the batch into a single multi-row insert:
insert into myobject (id, type) values (id1, type1), (id2, type2), ... (id20, type20)
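
If you configure Hibernate programmatically rather than only through hibernate.cfg.xml, the same property can be set in code - a minimal sketch, assuming a plain Configuration-based setup (the XML equivalent is <property name="hibernate.jdbc.batch_size">20</property>):

// imports: org.hibernate.SessionFactory, org.hibernate.cfg.Configuration
Configuration cfg = new Configuration().configure(); // reads hibernate.cfg.xml
cfg.setProperty("hibernate.jdbc.batch_size", "20");  // same value as the flush interval above
SessionFactory sessionFactory = cfg.buildSessionFactory();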