
My Spring Boot app uses Hibernate JPA to save records to MySQL.

After working well and fast for a while, save() suddenly became very slow: about 10 minutes to insert 1K records (the primary key is the ID).

I thought the problem was calling save() in a loop, so I switched to saveAll(), but to my surprise it made no difference, and it can take up to 20 minutes to save 1139 records.

The MySQL instance is local and is used only by my dev environment.

Here is the log file:

2018-08-07 14:44:25:335 - About to save all new 1139 cats of Oscar 
2018-08-07 15:03:47:758 - Succeeded saving the cats of Oscar
2018-08-07 15:07:41:961 - Session Metrics {
    600922575 nanoseconds spent acquiring 1140 JDBC connections;
    92320112 nanoseconds spent releasing 1139 JDBC connections;
    312825484 nanoseconds spent preparing 2283 JDBC statements;
    464811479497 nanoseconds spent executing 2279 JDBC statements;
    233368433210 nanoseconds spent executing 229 JDBC batches;
    0 nanoseconds spent performing 0 L2C puts;
    0 nanoseconds spent performing 0 L2C hits;
    0 nanoseconds spent performing 0 L2C misses;
    233765441110 nanoseconds spent executing 1 flushes (flushing a total of 1139 entities and 0 collections);
    40198 nanoseconds spent executing 1 partial-flushes (flushing a total of 0 entities and 0 collections)
}

I added these to the properties file:

spring.jpa.properties.hibernate.jdbc.batch_size=5
spring.jpa.properties.hibernate.order_inserts=true
spring.jpa.properties.hibernate.order_updates=true
spring.jpa.properties.hibernate.generate_statistics=true

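One MySQL-specific point worth checking alongside these Hibernate settings: Connector/J only turns a JDBC batch into a true multi-row INSERT when statement rewriting is enabled on the datasource URL. A sketch of what that would look like (host, port, and schema name here are placeholders, not taken from the question):

```properties
# Hypothetical datasource URL; only the rewriteBatchedStatements flag is the point
spring.datasource.url=jdbc:mysql://localhost:3306/mydb?rewriteBatchedStatements=true
```
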
Here is the code in the Service method:

    log.debug("About to save all new {} cats of {} ", allCats.size(), owner);
    catsRepository.saveAll(allCats);
    log.debug("Succeeded saving the cats of {}", owner);
riorio
  • Probably a duplicate question. Check this: https://stackoverflow.com/questions/15893156/best-way-to-save-the-large-number-of-records-in-hibernate – Rahul Babu Aug 07 '18 at 13:12

1 Answer


The batch size setting alone will not have an effect.

Here is the important part from the Hibernate documentation: http://docs.jboss.org/hibernate/orm/5.2/userguide/html_single/Hibernate_User_Guide.html#batch-session-batch-insert

Batch inserts

When you make new objects persistent, employ methods flush() and clear() to the session regularly, to control the size of the first-level cache.

Example 3. Flushing and clearing the Session

EntityManager entityManager = null;
EntityTransaction txn = null;
try {
    entityManager = entityManagerFactory().createEntityManager();

    txn = entityManager.getTransaction();
    txn.begin();

    int batchSize = 25;

    for ( int i = 0; i < entityCount; i++ ) {
        if ( i > 0 && i % batchSize == 0 ) {
            //flush a batch of inserts and release memory
            entityManager.flush();
            entityManager.clear();
        }

        Person person = new Person( String.format( "Person %d", i ) );
        entityManager.persist( person );
    }

    txn.commit();
} catch (RuntimeException e) {
    if ( txn != null && txn.isActive() ) {
        txn.rollback();
    }
    throw e;
} finally {
    if (entityManager != null) {
        entityManager.close();
    }
}
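
Applied to the saveAll() call from the question, the same idea means splitting the list into chunks and flushing/clearing between them. A minimal sketch of the chunking part (the ChunkedSave class, partition(), and saveInChunks() are hypothetical helpers, not Spring or Hibernate API; in the real service the consumer would call catsRepository.saveAll(chunk) followed by flush() and clear() on an injected EntityManager):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ChunkedSave {

    // Split a list into consecutive sub-lists of at most chunkSize elements.
    static <T> List<List<T>> partition(List<T> items, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < items.size(); i += chunkSize) {
            chunks.add(items.subList(i, Math.min(i + chunkSize, items.size())));
        }
        return chunks;
    }

    // Drive a chunk-at-a-time save: the consumer stands in for
    // "catsRepository.saveAll(chunk); entityManager.flush(); entityManager.clear();"
    // in a real Spring service method.
    static <T> void saveInChunks(List<T> all, int chunkSize, Consumer<List<T>> saveChunk) {
        for (List<T> chunk : partition(all, chunkSize)) {
            saveChunk.accept(chunk);
        }
    }
}
```

With a chunk size matching hibernate.jdbc.batch_size, each flush stays small and the persistence context never holds all 1139 entities at once.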
Simon Martinelli