
I'm supposed to do stress testing on my application, for which I need 60 million records in a table. The approach I'm following is: first I initialize an instance of the Connection class, then I insert the data in a for loop through a PreparedStatement on that Connection, and after every 500 inserted rows I commit and reset the connection.

Using this, I'm able to insert only about 150 records per minute. I just want to know if there's a more optimized way of doing it, so that the execution time is minimal.
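For reference, the insert loop looks roughly like this (the table, columns, and connection URL are placeholders, not my real schema):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class RowByRowInsert {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details
        Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//host:1521/service", "user", "pass");
        conn.setAutoCommit(false);
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO stress_test (id, payload) VALUES (?, ?)");
        for (int i = 1; i <= 60_000_000; i++) {
            ps.setInt(1, i);
            ps.setString(2, "row-" + i);
            ps.executeUpdate();   // one statement execution per row
            if (i % 500 == 0) {
                conn.commit();    // commit every 500 rows
            }
        }
        conn.commit();
        ps.close();
        conn.close();
    }
}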

Vishal Afre
  • Investigate batch inserts or load the data using the Oracle utilities instead of your code. – Jim Garrison Nov 11 '16 at 20:04
  • Committing in a loop is not going to improve performance in Oracle - the opposite is the case. Only commit once at the end of your transaction. –  Nov 11 '16 at 21:58
  • But 150 rows per minute is ***way*** too slow. I can get about 1 million per minute with a local server (narrow table with 6 columns). –  Nov 11 '16 at 22:08
  • Did you try a bulk insert operation? It inserts much faster than a batch insert. – Mister X Nov 14 '16 at 12:01

1 Answer


Use JDBC batch inserts:

PreparedStatement ps = conn.prepareStatement(sql);
int count = 0;
for (/* loop construct here */) {
    count++;
    ps.setString(1, someString);
    // ... and so forth for the other parameters
    ps.addBatch();               // queue the row instead of executing it immediately

    if (count % 500 == 0)
        ps.executeBatch();       // send the queued rows to the server in one round trip
}
ps.executeBatch();               // flush the last, partially filled batch
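For completeness, here's a self-contained sketch that combines batching with the single-commit advice from the comments above (the connection URL, table, and columns are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsert {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@//host:1521/service", "user", "pass");  // placeholder URL/credentials
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO stress_test (id, payload) VALUES (?, ?)")) {  // placeholder table/columns
            conn.setAutoCommit(false);          // no commit per statement
            for (int i = 1; i <= 60_000_000; i++) {
                ps.setInt(1, i);
                ps.setString(2, "row-" + i);
                ps.addBatch();                  // queue the row
                if (i % 500 == 0) {
                    ps.executeBatch();          // one round trip per 500 rows
                }
            }
            ps.executeBatch();                  // flush the final batch
            conn.commit();                      // single commit at the end
        }
    }
}

Each `executeBatch` sends the queued rows to the server in a single round trip, and with auto-commit off nothing is committed until the very end, which avoids the per-commit overhead of the original approach.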
Gurwinder Singh
  • Hi! Do I need to execute `ps.clearBatch();` after the execute? – Mariano L Jul 30 '18 at 17:56
  • @MarianoL No, you shouldn't need to do that. `executeBatch` clears the list of batched commands after executing them. https://stackoverflow.com/questions/58522625/does-executebatch-also-clear-the-list-of-commands – Gurwinder Singh Mar 03 '22 at 16:13