I have an app where, upon pressing a Start button, a service starts that polls a few sensors and stores the sensor data in an object whenever the sensor values change. Every 10ms, a database insert takes the object's current values and stores them in the database. This goes on for 30 minutes.
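To make that concrete, the capture side looks roughly like this. It's just a sketch for the sake of the question; SensorSnapshot, SensorState, and the field names are placeholders, not my real classes:

```java
import java.util.concurrent.atomic.AtomicReference;

// Immutable holder for one set of sensor values.
class SensorSnapshot {
    final long timestampMs;
    final double sensorA;
    final double sensorB;

    SensorSnapshot(long timestampMs, double sensorA, double sensorB) {
        this.timestampMs = timestampMs;
        this.sensorA = sensorA;
        this.sensorB = sensorB;
    }
}

class SensorState {
    // Sensor callbacks swap in a new snapshot whenever a value changes;
    // the 10ms timer just reads whatever is current at that moment.
    private final AtomicReference<SensorSnapshot> latest =
            new AtomicReference<>(new SensorSnapshot(0, 0, 0));

    void onSensorChanged(double a, double b) {
        latest.set(new SensorSnapshot(System.currentTimeMillis(), a, b));
    }

    SensorSnapshot current() {
        return latest.get();
    }
}
```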
Given the speed and duration of the inserts, I want to run them on a separate thread from the UI thread so navigation doesn't take a hit. So my service (the producer) will offer data to that thread by adding it to a queue, and the other thread (the consumer) will take from the queue and insert into the database.
When the Stop button is pressed, I need to make sure to process the rest of the queue before killing off the thread.
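Here's the rough shape I'm picturing for the hand-off and for draining on Stop, reusing the placeholder SensorSnapshot/SensorState types from above (again just a sketch of the idea, not my actual service code; RecordingSession, insert(), etc. are made-up names):

```java
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

class RecordingSession {
    // Sentinel "poison pill": once the consumer sees it, everything queued
    // before it has already been inserted, so it is safe to exit.
    private static final SensorSnapshot POISON = new SensorSnapshot(-1, 0, 0);

    private final SensorState sensors = new SensorState();
    private final LinkedBlockingQueue<SensorSnapshot> queue = new LinkedBlockingQueue<>();
    private final ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
    private final Thread dbWriter = new Thread(this::consumeLoop, "db-writer");

    void start() {
        dbWriter.start();
        // Producer: every 10ms, hand the current sensor values to the consumer.
        timer.scheduleAtFixedRate(() -> queue.offer(sensors.current()),
                0, 10, TimeUnit.MILLISECONDS);
    }

    void stop() throws InterruptedException {
        timer.shutdown();                        // no more 10ms offers after this
        timer.awaitTermination(1, TimeUnit.SECONDS);
        queue.offer(POISON);                     // marks the end of the data
        dbWriter.join();                         // wait for the backlog to be inserted
    }

    private void consumeLoop() {
        try {
            while (true) {
                SensorSnapshot s = queue.take(); // blocks only while the queue is empty
                if (s == POISON) {
                    return;                      // queue fully processed, thread can die
                }
                insert(s);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    private void insert(SensorSnapshot s) {
        // The real code does the database insert here.
    }
}
```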
It seems that everywhere I look, some sort of blocking queue is recommended for producer/consumer situations (e.g. LinkedBlockingQueue vs ConcurrentLinkedQueue, or What is the difference between LinkedBlockingQueue and ConcurrentLinkedQueue?).
My question is, does a blocking queue make sense in my situation?
The most vital thing in this app is that all of the data gets inserted into the db. From what I understand (please correct me if I'm wrong), if a bounded queue becomes full and the consumer thread can't do inserts quickly enough to free up space, then the producer is blocked from adding anything to the queue. If that's right, then by the time the queue has free space again, a few sensor readings will have gone by, and they won't make it into the db because of the blocking.
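In other words, my worry is the difference between the two ways of adding to a bounded queue. A toy example with a capacity-1 ArrayBlockingQueue (nothing from my app, just to illustrate put() blocking versus offer() rejecting when full):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BoundedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> bounded = new ArrayBlockingQueue<>(1);

        bounded.put("reading-1");                      // queue is now full

        boolean accepted = bounded.offer("reading-2"); // returns false immediately, nothing stored
        System.out.println("offer accepted: " + accepted);

        // bounded.put("reading-2");                   // would block this thread until space
                                                       // frees up -- the stall I'm afraid of
    }
}
```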
At the end of the day, I just need the best way to ensure that data gets inserted every 10ms without skipping a beat. In my mind it makes sense to dump the values into some unbounded queue every 10ms (one entry per 10ms for 30 minutes works out to at most around 180,000 entries, if my math is right, so it shouldn't grow out of control) and have the consumer poll it as soon as it's able. Then when Stop is pressed, drain the rest of the queue before killing the thread.
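By "drain the rest" I mean something along these lines at shutdown, once the 10ms producer has stopped (same placeholder SensorSnapshot type as above; drainTo just moves whatever is left in the queue so it can be inserted before the thread exits):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.LinkedBlockingQueue;

class ShutdownDrain {
    // Called after the 10ms producer has been stopped, so no new items arrive.
    static void flushRemaining(LinkedBlockingQueue<SensorSnapshot> queue) {
        List<SensorSnapshot> remaining = new ArrayList<>();
        queue.drainTo(remaining);       // pulls everything still sitting in the queue
        for (SensorSnapshot s : remaining) {
            insert(s);                  // final inserts before the consumer thread exits
        }
    }

    private static void insert(SensorSnapshot s) {
        // The real code does the database insert here.
    }
}
```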
So what is the correct way to handle this in a 1-producer/1-consumer situation?