
When an external event occurs (incoming measurement data), an event handler in my Java code is called. The data should be written to a MySQL database. Due to the high frequency of these calls (>1000 per second), I'd like to handle the inserts efficiently. Unfortunately I'm not a professional developer, and I'm an idiot with databases.

Neglecting the efficiency aspect my code would look roughly like this:

public class X {

    public void eventHandler(String data) throws SQLException {
        Connection connection = DriverManager.getConnection(/* connection URL etc. */);
        PreparedStatement statement = connection.prepareStatement("insert …");
        statement.setString(1, data);
        statement.executeUpdate();
        statement.close();
        connection.close();
    }
}

My understanding is that by calling addBatch() and executeBatch() on the statement I could limit the physical disk access to, let's say, every 1000th insert. However, as you can see in my code sketch above, the statement object is newly instantiated with every call of eventHandler(). Therefore my impression is that the batch mechanism won't be helpful in this context. The same goes for turning off auto-commit and then calling commit() on the connection object, since that one is closed after every insert.

I could turn connection and statement from local variables into class members and reuse them during the whole runtime of the program. But wouldn't it be bad style to keep the database connection open at all time?

A solution would be to buffer the data manually and then write to the database only after collecting a proper batch. But so far I still hope that you smart guys will tell me how to let the database do the buffering for me.
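The manual-buffering idea could be sketched roughly like this: a small generic buffer that collects items and hands a full batch to a flush action once a threshold is reached. The class name `BatchBuffer` and the `Consumer<List<T>>` callback are illustrative choices, not anything from the question; in real code, the flush action would call addBatch() per row on one reused PreparedStatement and then a single executeBatch().

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of manual buffering: collect items, flush every batchSize items.
// The flushAction stands in for the actual JDBC batch insert.
public class BatchBuffer<T> {
    private final int batchSize;
    private final Consumer<List<T>> flushAction;
    private final List<T> buffer = new ArrayList<>();

    public BatchBuffer(int batchSize, Consumer<List<T>> flushAction) {
        this.batchSize = batchSize;
        this.flushAction = flushAction;
    }

    // Called from the event handler; triggers a flush when full.
    public synchronized void add(T item) {
        buffer.add(item);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    // Flush whatever has accumulated (also useful on shutdown,
    // so the tail of the data is not lost).
    public synchronized void flush() {
        if (!buffer.isEmpty()) {
            flushAction.accept(new ArrayList<>(buffer));
            buffer.clear();
        }
    }
}
```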

Mike

2 Answers


I could turn connection and statement from local variables into class members and reuse them during the whole runtime of the program. But wouldn't it be bad style to keep the database connection open at all time?

Considering that most database connection pools are configured to keep one or more connections open at all times, I wouldn't call it "bad style". The whole point of pooling is to avoid the overhead of establishing a new connection for each database operation (a new connection is only opened when all existing ones are in use and the pool allows for more).

I'd probably go with some form of batching in this case (but of course I don't know all your requirements, environment, etc.). If the data doesn't need to be immediately available elsewhere, you could build some form of job queue for writing the data: push the incoming data there and let one or more other threads take care of writing it to the database in batches of N. Take a look at the classes available in the java.util.concurrent package.
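One way this job-queue idea could look, as a sketch: the event handler only enqueues into a BlockingQueue, and a single writer thread drains it in batches. The class and method names here are made up for illustration, and writeBatch() just counts items; in real code it would hold one long-lived Connection and PreparedStatement and call executeBatch() per drained batch.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeUnit;

// Job-queue sketch: producers enqueue, one writer thread drains in batches.
public class BatchingWriter {
    static final int BATCH_SIZE = 1000;

    final BlockingQueue<String> queue = new ArrayBlockingQueue<>(100_000);
    final Thread writer = new Thread(this::drainLoop);
    volatile boolean running = true;
    volatile int written = 0;  // stand-in for "rows inserted"

    public void start() { writer.start(); }

    // Called from the high-frequency event handler: just enqueue.
    public void eventHandler(String data) throws InterruptedException {
        queue.put(data);
    }

    void drainLoop() {
        List<String> batch = new ArrayList<>(BATCH_SIZE);
        while (running || !queue.isEmpty()) {
            try {
                // Wait briefly for the first element, then grab the rest.
                String first = queue.poll(100, TimeUnit.MILLISECONDS);
                if (first == null) continue;
                batch.add(first);
                queue.drainTo(batch, BATCH_SIZE - 1);
                writeBatch(batch);
                batch.clear();
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return;
            }
        }
    }

    // In real code: addBatch() per row on a reused PreparedStatement,
    // then one executeBatch() and commit per batch.
    void writeBatch(List<String> batch) {
        written += batch.size();
    }

    public void stop() throws InterruptedException {
        running = false;
        writer.join();
    }
}
```

This keeps the event handler cheap (a queue insert) and moves all JDBC work onto the writer thread, which also means only one connection is ever needed.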

esaj

I would suggest you use a LinkedList<> to buffer the data (like a queue), then store the data in the DBMS as and when required in a separate thread, executed at regular intervals (maybe every 2 seconds?). Note that LinkedList is not thread-safe, so access to it would have to be synchronized (or you could use one of the concurrent queues instead).

See how to construct a queue using linkedlist in java
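A sketch of that timed-flush approach, with assumptions: a ConcurrentLinkedQueue is used in place of a synchronized LinkedList, and flush() only counts items where the real code would run one JDBC batch insert. All names here are illustrative.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Timed-flush sketch: the event handler enqueues, and a scheduled task
// drains the queue every flushMillis milliseconds.
public class TimedFlusher {
    final Queue<String> queue = new ConcurrentLinkedQueue<>();
    final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    volatile int flushedCount = 0;  // stand-in for "rows inserted"

    public void start(long flushMillis) {
        scheduler.scheduleAtFixedRate(this::flush, flushMillis, flushMillis,
                TimeUnit.MILLISECONDS);
    }

    // Called from the high-frequency event handler: just enqueue.
    public void eventHandler(String data) {
        queue.offer(data);
    }

    // Drain whatever has accumulated; in real code this batch would go
    // into one executeBatch() call on a reused PreparedStatement.
    void flush() {
        List<String> batch = new ArrayList<>();
        for (String s; (s = queue.poll()) != null; ) {
            batch.add(s);
        }
        flushedCount += batch.size();
    }

    public void stop() {
        scheduler.shutdown();
        try {
            scheduler.awaitTermination(1, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        flush();  // final flush so the tail of the data is not lost
    }
}
```

The trade-off versus size-based batching is latency: data may sit in the queue for up to one flush interval before it reaches the database.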

Aniket Inge