
I'm looking for a pythonic and simple way to synchronously share a common data source across multiple Python processes.

I've been thinking about using Pyro4 or Flask to write a kind of CRUD service that I can get and put objects from and into. But Flask seems like a lot of code for such a simple task, and Pyro4 appears to require a name server.

Do you know of any (preferably easy to use, matured, high-level) library or package that provides centralized storage and high performance access to objects shared across multiple Python processes?

Hendrik Wiese

1 Answer


Take a look at Redis

Redis is an in-memory key-value database.

You can use the redis-py client library to work with Redis from Python.
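
A minimal sketch of that idea, assuming a Redis server reachable on localhost:6379 and redis-py installed; the key name "shared:config" is only a placeholder. One process pickles an object into Redis and another reads it back:

    # Minimal sketch: share a pickled Python object through Redis.
    # Assumes a local Redis server on localhost:6379 and redis-py installed;
    # the key name "shared:config" is only a placeholder.
    import pickle
    import redis

    r = redis.Redis(host="localhost", port=6379)

    # Writer process: serialize an object and store it under a key.
    state = {"threshold": 0.75, "labels": ["a", "b"]}
    r.set("shared:config", pickle.dumps(state))

    # Reader process: fetch and deserialize the same object.
    raw = r.get("shared:config")
    if raw is not None:
        print(pickle.loads(raw))

Note that each process works on its own deserialized copy; a change only becomes visible to the others once it is pickled and written back to the key.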

Corentin Limier
  • Well, with Redis I'd have to serialize the objects myself. I actually thought of something that would simply accept a Python object and store it in a way that other processes can work with it, call its methods, and have it updated internally so that all processes working with it stay constantly in sync. I know there are concurrency issues, possible deadlocks, race conditions and things like that to consider. That's why I'm asking for a mature library that has already taken all of this into account. – Hendrik Wiese Sep 14 '18 at 14:32
  • @HendrikWiese How do you manage multiprocessing? With the multiprocessing library, or do you simply launch multiple scripts at the same time? – Corentin Limier Sep 14 '18 at 14:38
  • The latter actually. – Hendrik Wiese Sep 14 '18 at 15:35
  • @HendrikWiese I don't know a simple solution for your problem if you run multiple processes like this. Maybe you should try the multiprocessing library. More information here: https://stackoverflow.com/questions/3671666/sharing-a-complex-object-between-python-processes . If this is overkill for your problem, I would go with Redis, but maybe others will find a better solution. – Corentin Limier Sep 14 '18 at 15:40
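
Following up on the multiprocessing suggestion in the last comment, here is a minimal sketch using multiprocessing.managers.BaseManager, which is roughly what the question describes: a single object lives in a manager process, and other processes call its methods through proxies, so they all see the same state. The Store class, its put/get methods, and the key names are purely illustrative.

    # Minimal sketch: a shared object whose methods are called from
    # several processes through multiprocessing manager proxies.
    # The Store class, its methods, and the key names are placeholders.
    from multiprocessing import Process
    from multiprocessing.managers import BaseManager

    class Store:
        """Lives once, inside the manager's server process."""
        def __init__(self):
            self._data = {}

        def put(self, key, value):
            self._data[key] = value

        def get(self, key):
            return self._data.get(key)

    class StoreManager(BaseManager):
        pass

    # Expose Store through the manager; manager.Store() returns a proxy.
    StoreManager.register("Store", Store)

    def worker(store, i):
        # Method calls on the proxy are forwarded to the single Store
        # instance living in the manager process.
        store.put(f"proc-{i}", i * i)

    if __name__ == "__main__":
        with StoreManager() as manager:
            store = manager.Store()
            procs = [Process(target=worker, args=(store, i)) for i in range(4)]
            for p in procs:
                p.start()
            for p in procs:
                p.join()
            print([store.get(f"proc-{i}") for i in range(4)])

For scripts that are launched separately rather than spawned from one parent, the same manager can be given an explicit address and authkey (for example StoreManager(address=("127.0.0.1", 50000), authkey=b"secret")); the script acting as the server calls start(), while the other scripts register the same type name and call connect().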