When an external event occurs (incoming measurement data), an event handler in my Java code is called. The data should be written to a MySQL database. Because these calls arrive at a high frequency (>1000 per second), I'd like to handle the inserts efficiently. Unfortunately I'm not a professional developer and an idiot with databases.
Neglecting the efficiency aspect my code would look roughly like this:
public class X {
    public void eventHandler(String data) throws SQLException {
        Connection connection = DriverManager.getConnection(/* url, user, password */);
        PreparedStatement statement = connection.prepareStatement("INSERT …");
        statement.setString(1, data);
        statement.executeUpdate();
        statement.close();
        connection.close();
    }
}
My understanding is that by calling addBatch() and executeBatch() on the statement, I could limit the physical disk access to, say, every 1000th insert. However, as you can see in the sketch above, the statement object is newly instantiated with every call to eventHandler(), so my impression is that the batch mechanism won't help in this context. The same goes for turning off auto-commit and then calling commit() on the connection object, since the connection is closed after every insert.
I could turn connection and statement from local variables into class members and reuse them for the whole runtime of the program. But wouldn't it be bad style to keep the database connection open at all times?
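For illustration, here is roughly what I mean by the member-variable approach combined with batching. This is only a sketch; the JDBC URL, table name, and column name are placeholders I made up, and error handling is omitted:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Sketch: connection and statement as members, reused across events.
// URL, table, and column names below are placeholders, not real ones.
public class MeasurementWriter implements AutoCloseable {
    public static final int BATCH_SIZE = 1000; // flush every 1000th insert

    private final Connection connection;
    private final PreparedStatement statement;
    private int pending = 0;

    public MeasurementWriter(String jdbcUrl) throws SQLException {
        connection = DriverManager.getConnection(jdbcUrl);
        connection.setAutoCommit(false); // commit once per batch, not per row
        statement = connection.prepareStatement(
                "INSERT INTO measurements (value) VALUES (?)");
    }

    public void eventHandler(String data) throws SQLException {
        statement.setString(1, data);
        statement.addBatch();            // queued client-side, no round trip yet
        if (++pending >= BATCH_SIZE) {
            flush();
        }
    }

    public void flush() throws SQLException {
        statement.executeBatch();        // one round trip for the whole batch
        connection.commit();
        pending = 0;
    }

    @Override
    public void close() throws SQLException {
        flush();                         // write any partial batch
        statement.close();
        connection.close();
    }
}
```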
A solution would be to buffer the data manually and write to the database only after collecting a proper batch. But so far I'm still hoping that you smart guys will tell me how to let the database do the buffering for me.
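In case it helps clarify the question, the manual buffering I have in mind would look something like this. The class and method names are made up, and the flush action is left abstract; in the real program it would run the batched INSERT:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Sketch of manual buffering: collect incoming measurements in memory and
// hand them to a flush action only once a full batch has accumulated.
public class EventBuffer {
    private final int batchSize;
    private final Consumer<List<String>> flushAction;
    private List<String> buffer = new ArrayList<>();

    public EventBuffer(int batchSize, Consumer<List<String>> flushAction) {
        this.batchSize = batchSize;
        this.flushAction = flushAction;
    }

    public void add(String data) {
        buffer.add(data);
        if (buffer.size() >= batchSize) {
            flush();
        }
    }

    public void flush() {
        if (buffer.isEmpty()) return;
        List<String> batch = buffer;
        buffer = new ArrayList<>();      // start a fresh buffer
        flushAction.accept(batch);       // e.g. addBatch per row + executeBatch
    }
}
```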