92

I want to insert multiple rows into a MySQL table at once using Java. The number of rows is dynamic. In the past I was doing...

for (String[] element : array) {
    myStatement.setString(1, element[0]);
    myStatement.setString(2, element[1]);

    myStatement.executeUpdate();
}

I'd like to optimize this to use the MySQL-supported syntax:

INSERT INTO table (col1, col2) VALUES ('val1', 'val2'), ('val1', 'val2')[, ...]

but with a PreparedStatement I don't know of any way to do this since I don't know beforehand how many elements array will contain. If it's not possible with a PreparedStatement, how else can I do it (and still escape the values in the array)?

Jonas
Tom Marthenal

7 Answers

192

You can create a batch using PreparedStatement#addBatch() and execute it with PreparedStatement#executeBatch().

Here's a kickoff example:

public void save(List<Entity> entities) throws SQLException {
    try (
        Connection connection = database.getConnection();
        PreparedStatement statement = connection.prepareStatement(SQL_INSERT);
    ) {
        int i = 0;

        for (Entity entity : entities) {
            statement.setString(1, entity.getSomeProperty());
            // ...

            statement.addBatch();
            i++;

            if (i % 1000 == 0 || i == entities.size()) {
                statement.executeBatch(); // Execute every 1000 items.
            }
        }
    }
}

It's executed every 1000 items because some JDBC drivers and/or DBs may have a limitation on batch length.
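For reference, SQL_INSERT is just an ordinary single-row parameterized INSERT; the driver binds one set of parameters per addBatch() call. A minimal sketch (the table and column names are hypothetical):

// Hypothetical table and column names; one parameter set is bound per addBatch() call.
private static final String SQL_INSERT =
        "INSERT INTO some_table (some_property, other_property) VALUES (?, ?)";

Wrapping the whole loop in a transaction (connection.setAutoCommit(false) before the loop and connection.commit() after the last executeBatch()) typically speeds the inserts up further, as suggested in the comments.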

BalusC
  • Your inserts will go faster if you put them in transactions... i.e. wrap with `connection.setAutoCommit(false);` and `connection.commit();` http://download.oracle.com/javase/tutorial/jdbc/basics/transactions.html – Joshua Martell Dec 04 '10 at 20:39
  • Looks like you can execute an empty batch if there are 999 items. – djechlin Mar 07 '13 at 21:31
  • @electricalbah it will execute normally because `i == entities.size()` – Yohanim Mar 31 '17 at 06:43
  • Here's another good resource on putting batch jobs together using prepared statements. https://viralpatel.net/blogs/batch-insert-in-java-jdbc/ – Danny Bullis May 03 '18 at 19:54
  • In this case what does `SQL_INSERT` look like? Something like this `INSERT INTO MyTable (value1, value2) VALUES (?,?)`? – André Jun 04 '18 at 10:11
  • @AndréPaulo: Just any SQL INSERT suitable for a prepared statement. Refer to the JDBC tutorial links for basic examples. This is not related to the concrete question. – BalusC Jun 04 '18 at 10:13
  • for others, a check of `connection.getMetaData().supportsBatchUpdates();` may be useful if adding support for other database types – Ben Jan 01 '22 at 10:07
34

When the MySQL driver is used, you have to set the connection parameter `rewriteBatchedStatements` to true (jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true).

With this parameter the batched statements are rewritten into a single bulk INSERT, so the table is locked only once and the indexes are updated only once. It is therefore much faster.

Without this parameter, the only advantage of batching is cleaner source code.
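A minimal sketch of a connection factory using such a URL (the host, database, and credentials below are placeholders):

// Placeholder host, database, and credentials; the key part is the rewriteBatchedStatements flag.
private static final String URL =
        "jdbc:mysql://localhost:3306/TestDB?rewriteBatchedStatements=true";

public Connection getConnection() throws SQLException {
    return DriverManager.getConnection(URL, "user", "password");
}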

Lokesh Pandey
MichalSv
  • This is a comment on performance for the construction: `statement.addBatch(); if ((i + 1) % 1000 == 0) { statement.executeBatch(); // Execute every 1000 items. }` – MichalSv May 29 '14 at 16:27
  • Apparently MySQL driver has a bug http://bugs.mysql.com/bug.php?id=71528 This also causes issues for ORM frameworks like Hibernate https://hibernate.atlassian.net/browse/HHH-9134 – Shailendra May 31 '14 at 19:42
  • Yes, this is still correct, at least for the `5.1.45` mysql connector version. – v.ladynev Feb 27 '18 at 22:37
  • mysql-connector-java 8.0.14: just checked, and it is correct for 8.0.14 as well. Without adding `rewriteBatchedStatements=true` there is no performance gain. – vincent mathew May 18 '19 at 03:00
9

If you can create your SQL statement dynamically, you can use the following workaround:

String myArray[][] = { { "1-1", "1-2" }, { "2-1", "2-2" }, { "3-1", "3-2" } };

StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");

for (int i = 0; i < myArray.length - 1; i++) {
    mySql.append(", (?, ?)");
}

myStatement = myConnection.prepareStatement(mySql.toString());

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString(i, myArray[i][1]);
    myStatement.setString(i, myArray[i][2]);
}
myStatement.executeUpdate();
Miss Chanandler Bong
Ali Shakiba
  • I believe the accepted answer is far better!! I didn't know about batch updates, and when I started writing this answer that answer had not been submitted yet!!! :) – Ali Shakiba Dec 04 '10 at 23:20
  • This approach is much faster than the accepted one. I tested it, but can't find out why. @JohnS do you know why? – julian0zzx Nov 21 '12 at 10:27
  • @julian0zzx no, but maybe because it is executed as a single sql instead of multiple. But I'm not sure. – Ali Shakiba Nov 23 '12 at 18:36
3

In case you have auto-increment in the table and need to access the generated keys, you can use the following approach. Do test it before using, because support for getGeneratedKeys() in Statement depends on the driver used. The code below was tested on MariaDB 10.0.12 and the MariaDB JDBC driver 1.2.

Remember that increasing the batch size improves performance only to a certain extent; for my setup, increasing the batch size above 500 actually degraded performance.

public Connection getConnection(boolean autoCommit) throws SQLException {
    Connection conn = dataSource.getConnection();
    conn.setAutoCommit(autoCommit);
    return conn;
}

private void testBatchInsert(int count, int maxBatchSize) {
    String querySql = "insert into batch_test(keyword) values(?)";
    try {
        Connection connection = getConnection(false);
        PreparedStatement pstmt = null;
        ResultSet rs = null;
        boolean success = true;
        int[] executeResult = null;
        try {
            pstmt = connection.prepareStatement(querySql, Statement.RETURN_GENERATED_KEYS);
            for (int i = 0; i < count; i++) {
                pstmt.setString(1, UUID.randomUUID().toString());
                pstmt.addBatch();
                if ((i + 1) % maxBatchSize == 0 || (i + 1) == count) {
                    executeResult = pstmt.executeBatch();
                }
            }
            ResultSet ids = pstmt.getGeneratedKeys();
            for (int i = 0; i < executeResult.length; i++) {
                ids.next();
                if (executeResult[i] == 1) {
                    System.out.println("Execute Result: " + i + ", Update Count: " + executeResult[i] + ", id: "
                            + ids.getLong(1));
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
            success = false;
        } finally {
            if (rs != null) {
                rs.close();
            }
            if (pstmt != null) {
                pstmt.close();
            }
            if (connection != null) {
                if (success) {
                    connection.commit();
                } else {
                    connection.rollback();
                }
                connection.close();
            }
        }
    } catch (SQLException e) {
        e.printStackTrace();
    }
}
gladiator
3

@Ali Shakiba your code needs some modification. Error part:

for (int i = 0; i < myArray.length; i++) {
     myStatement.setString(i, myArray[i][1]);
     myStatement.setString(i, myArray[i][2]);
}

Updated code:

String myArray[][] = {
    {"1-1", "1-2"},
    {"2-1", "2-2"},
    {"3-1", "3-2"}
};

StringBuffer mySql = new StringBuffer("insert into MyTable (col1, col2) values (?, ?)");

for (int i = 0; i < myArray.length - 1; i++) {
    mySql.append(", (?, ?)");
}

mySql.append(";"); // also add the terminator at the end of the SQL statement
myStatement = myConnection.prepareStatement(mySql.toString());

for (int i = 0; i < myArray.length; i++) {
    myStatement.setString((2 * i) + 1, myArray[i][0]);
    myStatement.setString((2 * i) + 2, myArray[i][1]);
}

myStatement.executeUpdate();
jrtapsell
vinay
  • This is a much faster and better approach in entire answer. This should be the accepted answer – Shankar Guru Nov 09 '17 at 08:25
  • As mentioned in the accepted answer, some JDBC drivers / databases have limits on the number of rows you can include in an INSERT statement. In the case of the above example, if `myArray` has a greater length than that limit, you'll hit an exception. In my case, I have a 1,000 row limit which elicits the need for a batch execution, because I could be potentially updating more than 1,000 rows on any given run. This type of statement should theoretically work fine if you know you're inserting less than the maximum allowed. Something to keep in mind. – Danny Bullis May 03 '18 at 19:44
  • To clarify, the above answer mentions JDBC driver / database limitations on batch length, but there may also be limits on number of rows included in an insert statement, as I've seen in my case. – Danny Bullis May 03 '18 at 19:52
0

It is possible to submit multiple updates in JDBC.

We can use Statement, PreparedStatement, and CallableStatement objects for batch updates, with auto-commit disabled.

The addBatch() and executeBatch() methods are available on all of these statement objects for batch updates.

The addBatch() method adds a statement (or a set of parameters) to the current batch.
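A minimal sketch of that pattern with a plain Statement (the connection setup and the table/values below are placeholders):

try (Connection connection = DriverManager.getConnection(url, user, password);
     Statement statement = connection.createStatement()) {
    connection.setAutoCommit(false); // disable auto-commit so the whole batch is one transaction

    // Hypothetical table and values; each SQL string becomes one entry in the batch.
    statement.addBatch("INSERT INTO my_table (col1, col2) VALUES ('a1', 'a2')");
    statement.addBatch("INSERT INTO my_table (col1, col2) VALUES ('b1', 'b2')");

    int[] updateCounts = statement.executeBatch(); // one update count per batched statement
    connection.commit();
}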

ouflak
kapil das
0

This might be helpful in your case: passing an array to a PreparedStatement.

Store the required values in an array and pass it to a function that inserts them.

String sql= "INSERT INTO table (col1,col2)  VALUES (?,?)";
String array[][] = new String [10][2];
for(int i=0;i<array.size();i++){
     //Assigning the values in individual rows.
     array[i][0] = "sampleData1";   
     array[i][1] = "sampleData2";
}
try{
     DBConnectionPrepared dbcp = new DBConnectionPrepared();            
     if(dbcp.putBatchData(sqlSaveAlias,array)==1){
        System.out.println("Success"); 
     }else{
        System.out.println("Failed");
     }
}catch(Exception e){
     e.printStackTrace();
}

putBatchData(sql, 2D array)

public int[] putBatchData(String sql, String args[][]) {
    int status[] = null;
    try {
        PreparedStatement stmt = con.prepareStatement(sql);
        for (int i = 0; i < args.length; i++) {
            for (int j = 0; j < args[i].length; j++) {
                stmt.setString(j + 1, args[i][j]);
            }
            stmt.addBatch(); // queue this row; the whole batch is executed once after the loop
        }
        status = stmt.executeBatch();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return status;
}
rapTOR