I want to collect some user actions from a web site, which will send an AJAX request to a tracking server on most page visits; the tracking server then extracts the relevant data structure and writes it to some database tables. As the website is running under heavy load (especially before Christmas), there are many views of the relevant pages, which results in many requests to the tracking server.
If a request comes in and I write the data to the database instantly, I will certainly run into trouble, e.g.
- Even if using a connection pool, I will run out of unused connections quite quickly, and waiting times in the pool's queue will keep increasing
- Concurrent access to the database might cause problems with table locking, as all data are written to the same table, and query times will start to increase
However, the advantage is that I don't need to care about synchronization.
Thus I want to collect a batch of data (e.g. 100 objects) and send it to the database in one go to take some load off. This solves the problems above because
- there is only one connection
- there is no concurrency at database table level
The problem is that I would now need to properly synchronize the underlying buffer mechanism. Since the tracking server and the database are running on the same machine, this approach should greatly improve overall performance.
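To make it concrete, the database side of this would probably look something like the following sketch. The table, column, and class names are just placeholders I made up for illustration; the actual persistence layer is still open:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.List;

public class TrackingBatchWriter {

    // Assumed table and column names, purely for illustration.
    private static final String INSERT_SQL =
            "INSERT INTO page_view (page, visitor_id, viewed_at) VALUES (?, ?, ?)";

    /** Minimal value holder for one tracked page view (placeholder for the real event type). */
    public static class PageView {
        final String page;
        final String visitorId;
        final Timestamp viewedAt;

        public PageView(String page, String visitorId, Timestamp viewedAt) {
            this.page = page;
            this.visitorId = visitorId;
            this.viewedAt = viewedAt;
        }
    }

    /** Inserts a whole batch over a single connection and a single prepared statement. */
    public static void writeBatch(Connection connection, List<PageView> batch) throws SQLException {
        try (PreparedStatement ps = connection.prepareStatement(INSERT_SQL)) {
            for (PageView view : batch) {
                ps.setString(1, view.page);
                ps.setString(2, view.visitorId);
                ps.setTimestamp(3, view.viewedAt);
                ps.addBatch();
            }
            ps.executeBatch(); // sends the whole batch in one go
        }
    }
}
```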
I am looking for some class or library for Java 8 which can buffer the objects in a FIFO manner and automatically write them to the database once a certain minimum size is reached. The library should handle concurrency as well and as fast as possible, since two items might be put into the buffer at the same time.
Do you know of such a library where I can hook in my own persistence layer or just use plain JDBC?
My best guess is that I could use a flavour of `java.util.concurrent.BlockingQueue`, e.g. `LinkedBlockingQueue.drainTo(Collection, maxElements)`, in combination with a timer which periodically polls the length of the queue. Does anyone have a better suggestion?
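To illustrate that idea, here is a rough sketch of what I have in mind. Batch size, flush interval, and class names are again just placeholders, and it reuses the PageView holder and writeBatch method from the JDBC sketch above:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Timestamp;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class BufferedTrackingService {

    private static final int BATCH_SIZE = 100;             // minimum size that triggers a flush
    private static final long FLUSH_INTERVAL_SECONDS = 5;  // timer period as a safety net for quiet times

    private final LinkedBlockingQueue<TrackingBatchWriter.PageView> queue = new LinkedBlockingQueue<>();
    private final ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
    private final Connection connection; // the single connection used for all batch writes

    public BufferedTrackingService(Connection connection) {
        this.connection = connection;
        // The timer periodically "polls the length of the queue" by simply draining what is there.
        scheduler.scheduleAtFixedRate(this::flush,
                FLUSH_INTERVAL_SECONDS, FLUSH_INTERVAL_SECONDS, TimeUnit.SECONDS);
    }

    /** Called from the request handler; offer() never blocks on an unbounded queue. */
    public void track(String page, String visitorId) {
        queue.offer(new TrackingBatchWriter.PageView(
                page, visitorId, new Timestamp(System.currentTimeMillis())));
        // size() is only approximate under concurrency, but good enough as a trigger;
        // the flush runs on the single scheduler thread, so flushes never overlap.
        if (queue.size() >= BATCH_SIZE) {
            scheduler.execute(this::flush);
        }
    }

    private void flush() {
        List<TrackingBatchWriter.PageView> batch = new ArrayList<>(BATCH_SIZE);
        // drainTo is thread-safe and moves up to BATCH_SIZE elements in one call.
        while (queue.drainTo(batch, BATCH_SIZE) > 0) {
            try {
                TrackingBatchWriter.writeBatch(connection, batch);
            } catch (SQLException e) {
                e.printStackTrace(); // real code would retry, log properly, or drop the batch
            }
            batch.clear();
        }
    }
}
```

Since `offer()` and `drainTo()` are thread-safe, the queue itself would take care of the synchronization, and the single-threaded scheduler keeps flushes from overlapping. I am just not sure whether a timer plus a size check is the best trigger, or whether an existing library already does this more robustly.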