I want to know the best practice for sharing a queue (resource) between two processes in Python. Here is what each process is doing:
Process_1: continuously gets data (in JSON format) from a streaming API
Process_2: is a daemon (similar to Sander Marechal's code) which commits data (one at a time) into a database
So, Process_1 (the producer) puts a unit of data onto this shared resource, and Process_2 (the consumer) polls the shared resource for new units of data and, if there are any, stores them in the DB.
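Roughly, this is the shape I am after (a minimal sketch only, assuming a `multiprocessing.Queue` could serve as the shared resource; `get_from_stream()` and `commit_to_db()` are hypothetical stand-ins for my real streaming client and DB code):

```python
import multiprocessing
import time

def get_from_stream():
    # Hypothetical stand-in for my real streaming-API client.
    n = 0
    while True:
        time.sleep(1)
        n += 1
        yield {"id": n}

def commit_to_db(item):
    # Hypothetical stand-in for my real database commit.
    print("committed", item)

def producer(queue):
    # Process_1: put each JSON document onto the shared queue as it arrives.
    for item in get_from_stream():
        queue.put(item)

def consumer(queue):
    # Process_2: block until a new unit of data is available, then commit it.
    while True:
        commit_to_db(queue.get())

if __name__ == "__main__":
    q = multiprocessing.Queue()
    multiprocessing.Process(target=producer, args=(q,)).start()
    multiprocessing.Process(target=consumer, args=(q,)).start()
```

What I cannot see is how this fits together when Process_2 is already running as a daemon.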
There are some options that came to my mind:

- Using pickle (drawback: extra overhead of pickling and de-pickling)
- Passing data via `stdout` of Process_1 to `stdin` of Process_2 (drawback: none, but I am not sure how to implement this with a daemon; see the sketch after this list)
- Using the `Pool` object in the `multiprocessing` library (drawback: not sure how to code this when one process is a daemon)
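For the second option, this is roughly what I have in mind (again only a sketch, assuming the two scripts are wired together as a shell pipeline; `stream_events()` and `commit_to_db()` are hypothetical stand-ins):

```python
# producer.py -- Process_1: write one JSON document per line to stdout
import json
import sys
import time

def stream_events():
    # Hypothetical stand-in for my real streaming-API client.
    n = 0
    while True:
        time.sleep(1)
        n += 1
        yield {"id": n}

for event in stream_events():
    sys.stdout.write(json.dumps(event) + "\n")
    sys.stdout.flush()  # so the consumer sees each line immediately
```

```python
# consumer.py -- Process_2: read one JSON document per line from stdin, commit it
import json
import sys

def commit_to_db(record):
    # Hypothetical stand-in for my real database commit.
    print("committed", record, file=sys.stderr)

for line in sys.stdin:
    commit_to_db(json.loads(line))
```

Run as `python producer.py | python consumer.py`. What I do not see is how to keep consumer.py running as a daemon (Sander Marechal style) while its stdin stays attached to the producer's stdout.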
I would like an optimal, commonly practiced solution, with some code :). Thanks.