0

I want to use the same log file in multiple processes. But the processes aren't started with multiprocessing, so they don't share anything.

While googling around, I realised that I should make a LogHandler which receives the messages through a queue. The LogHandler is then the only process that writes to the file. But the problem is how to share the queue with the other processes. How can I do that?
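
One documented possibility for this setup is a "remote manager": `multiprocessing.managers.BaseManager` can serve a queue on a known address, and unrelated processes connect to it over a socket. Below is a minimal sketch of that idea; the address, port, authkey, and file names are placeholder assumptions, not anything from the question.

```python
# loghandler.py - the single process that owns the queue and the log file
import queue
import threading
from multiprocessing.managers import BaseManager

log_queue = queue.Queue()

class QueueManager(BaseManager):
    pass

# Expose the queue under a well-known name.
QueueManager.register('get_log_queue', callable=lambda: log_queue)

def serve():
    # Placeholder address/port/authkey; pick your own.
    manager = QueueManager(address=('127.0.0.1', 50000), authkey=b'logkey')
    manager.get_server().serve_forever()

threading.Thread(target=serve, daemon=True).start()

# Single writer: drain the queue into the shared log file.
with open('shared.log', 'a') as f:
    while True:
        f.write(log_queue.get() + '\n')
        f.flush()
```

Any other process, no matter how it was started, could then connect with the same address and authkey and put messages on the queue:

```python
# client.py - any unrelated process that wants to log
from multiprocessing.managers import BaseManager

class QueueManager(BaseManager):
    pass

QueueManager.register('get_log_queue')
manager = QueueManager(address=('127.0.0.1', 50000), authkey=b'logkey')
manager.connect()
manager.get_log_queue().put('hello from an unrelated process')
```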

bb1950328
  • 1,403
  • 11
  • 18
  • Possible duplicate of [How to use multiprocessing queue in Python?](https://stackoverflow.com/questions/11515944/how-to-use-multiprocessing-queue-in-python) – blues Aug 19 '19 at 08:00
  • @blues I think not, because there you can pass the queue object when you call the method, but in my case, I somehow have to find and connect to the LogHandler process (which is already running) and get the queue – bb1950328 Aug 19 '19 at 08:07

2 Answers

0

You can create a lock file, and each process checks its value before touching the log file.

For example: create lockfile.txt containing "lock: 0". The first process reads it to check whether it is 0 or 1. If it is 0, the process writes 1 and accesses the log file. The other processes wait until they see 0 in the lock file again.
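
A minimal sketch of a lock-file approach. Note that the read-then-write scheme described above is racy (as the comment below points out), so this variant uses the lock file's *existence* as the flag instead: `os.open` with `O_CREAT | O_EXCL` fails atomically if the file already exists. The file names are placeholders.

```python
import os
import time

LOCK_PATH = 'logfile.lock'  # placeholder path

def acquire_lock(poll_interval=0.05):
    """Block until the lock file can be created atomically."""
    while True:
        try:
            # O_CREAT | O_EXCL fails if the file already exists,
            # so creating it doubles as an atomic test-and-set.
            fd = os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return
        except FileExistsError:
            time.sleep(poll_interval)

def release_lock():
    os.remove(LOCK_PATH)

def append_log(message, logfile='shared.log'):
    acquire_lock()
    try:
        with open(logfile, 'a') as f:
            f.write(message + '\n')
    finally:
        release_lock()
```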

ozkulah
  • 66
  • 10
  • but what happens when two processes want to acquire the lock at exactly the same time? File operations aren't atomic – bb1950328 Aug 19 '19 at 08:10
  • In a similar case I used the Linux "grep" and "sed" commands, so calling a one-line Linux command can work. – ozkulah Aug 19 '19 at 09:26
0

Python has packages such as fcntl and filelock which can call the system's locking facilities. You can refer to Locking a file in Python.
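
For example, with fcntl (Unix-only) each process can take an exclusive lock around its write; the file name here is a placeholder:

```python
import fcntl

def append_log(message, logfile='shared.log'):
    with open(logfile, 'a') as f:
        fcntl.flock(f, fcntl.LOCK_EX)   # blocks until the exclusive lock is free
        try:
            f.write(message + '\n')
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)
```

The third-party filelock package offers the same pattern cross-platform, along the lines of `with FileLock('shared.log.lock'): ...`.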

LiuChang
  • 739
  • 6
  • 13