
I have a script which can be run by any user who is connected to a server. This script writes to a single log file, but there is no restriction on who can use it at one time, so multiple people could attempt to write to the log and data might be lost. Is there a way for one instance of the code to know if other instances of that code are running? Moreover, is it possible to gather this information dynamically? (i.e., not allow data saving for the second user until the first user has completed his/her task)

I know I could do this with a text file. So I could write the user name to the file when they start, then delete it when they finish, but this could lead to errors if either step fails, such as on an unexpected script termination. So what other, more reliable ways are there?

Some information on the system: Python 2.7 is installed on a Windows 7 64-bit server via Anaconda. All connected machines are also Windows 7 64-bit. Thanks in advance.

wnnmaw
  • http://stackoverflow.com/questions/489861/locking-a-file-in-python – Josh Lee Nov 14 '13 at 14:32
  • you can possibly write logs in user specific files – alko Nov 14 '13 at 14:33
  • So locking the log file won't work because then a user won't be able to save the work they've done. Ideally, I want to catch this right as the script starts up. @alko The log files need to be universal, though keeping them in user files temporarily and then merging them might work, but that would be a pain – wnnmaw Nov 14 '13 at 14:36
  • Have the script check the logfile lock first, before the user does any work. That way no work is lost if they can't write to the log. Then the frustration will be in figuring out when the system is available for use. User specific log files seems friendlier to the end user. You say merging logs would be a pain. Too bad there's not a way to automate that. Maybe with a scripting language, or something... :) – jwygralak67 Nov 14 '13 at 16:54
  • Wait... Is the log file the only common resource that users are competing for? If so, it seems like the real question is how to do concurrent logging from multiple sources. This is a solved problem in the unix syslog world. Surely python has a library that can achieve something similar. – jwygralak67 Nov 14 '13 at 17:00
  • If there is a python library for that I'd love to see it! There are a few other common files besides the log that get used less frequently, but the biggest issue with multiple editors is that the log is imported rather than read, so I don't know if it's possible to make that dynamic – wnnmaw Nov 14 '13 at 18:10

2 Answers


Here is an implementation:

http://www.evanfosmark.com/2009/01/cross-platform-file-locking-support-in-python/

If you are using a lock, be aware that stale locks (left behind by hung or crashed processes) can be a pain. Have a process that periodically searches for locks created more than X minutes ago and frees them.
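
As a rough sketch of that approach (the names LOCK_PATH and MAX_AGE_SECONDS are illustrative, and it assumes os.O_EXCL gives an atomic create-or-fail on your filesystem):

    # Minimal exclusive lock file with stale-lock cleanup (Python 2.7).
    import os
    import time

    LOCK_PATH = "script.lock"      # hypothetical lock next to the log
    MAX_AGE_SECONDS = 10 * 60      # treat locks older than 10 min as stale

    def acquire_lock():
        """Atomically create the lock file; return True on success."""
        try:
            fd = os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.write(fd, str(os.getpid()))
            os.close(fd)
            return True
        except OSError:
            # Lock exists -- reclaim it if it looks stale.
            try:
                if time.time() - os.path.getmtime(LOCK_PATH) > MAX_AGE_SECONDS:
                    os.remove(LOCK_PATH)   # free the stale lock...
                    return acquire_lock()  # ...and retry once
            except OSError:
                pass  # someone else removed or reclaimed it first
            return False

    def release_lock():
        try:
            os.remove(LOCK_PATH)
        except OSError:
            pass  # already gone; nothing to do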

It just isn't clean to allow multiple users to write to a single log and hope things go OK. Why don't you write a daemon that handles logs? Other processes connect to a "logging port", and in the simplest case they only succeed if no one else has connected. You can just modify the echo server example given here (keep a timeout in the server for all connections):

http://docs.python.org/release/2.5.2/lib/socket-example.html
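
In case it helps, here is a minimal sketch of such a daemon along the lines of that example; HOST, PORT, and LOG_PATH are placeholders, and the single accept loop is what serializes the writers:

    # Single-writer logging daemon (Python 2.7): one client at a time.
    import socket

    HOST, PORT = "localhost", 5140   # hypothetical "logging port"
    LOG_PATH = "central.log"

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((HOST, PORT))
    server.listen(0)                 # minimal backlog: one writer at a time

    while True:
        conn, addr = server.accept()
        conn.settimeout(30)          # drop hung clients so the log frees up
        log = open(LOG_PATH, "a")
        try:
            while True:
                data = conn.recv(1024)
                if not data:         # client closed the connection
                    break
                log.write(data)
        except socket.timeout:
            pass                     # client went quiet; release the log
        finally:
            log.close()
            conn.close()

Because only one connection is serviced at a time, a second user's write simply waits (or fails) until the first user is done, which is the ordering the question asks for.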

If you want to know exactly who logged what, and make sure no one unauthorized gets in, you can use Unix sockets to restrict it to only certain uids/gids etc. Here is a very good example
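
For illustration only, a sketch of that uid check via SO_PEERCRED (Linux-only, so it would not apply to the Windows 7 setup in the question; SOCK_PATH and ALLOWED_UIDS are made up):

    # AF_UNIX server that rejects clients whose uid is not whitelisted.
    import os
    import socket
    import struct

    SOCK_PATH = "/tmp/logd.sock"     # hypothetical socket path
    ALLOWED_UIDS = set([1000, 1001]) # illustrative whitelist

    if os.path.exists(SOCK_PATH):
        os.remove(SOCK_PATH)
    server = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    server.bind(SOCK_PATH)
    server.listen(1)

    conn, _ = server.accept()
    # struct ucred on Linux is three ints: pid, uid, gid
    creds = conn.getsockopt(socket.SOL_SOCKET, socket.SO_PEERCRED,
                            struct.calcsize("3i"))
    pid, uid, gid = struct.unpack("3i", creds)
    if uid not in ALLOWED_UIDS:
        conn.close()                 # refuse unauthorized writers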

staticd

NTEventLogHandler is probably the easiest way to log to a given Windows machine/server, but it might make more sense to use SysLogHandler if you have a syslog sink on a Unix server.

The catch I can think of with SysLogHandler is that you'll likely need to poke holes through the Windows firewall in order to send packets over the syslog protocol, i.e., 514/TCP ("reliable syslog") and 514/UDP (traditional or "unreliable" syslog).
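
A minimal sketch of wiring up either handler (the app name "SharedScript" and the host "sysloghost" are placeholders; NTEventLogHandler needs the pywin32 package):

    # Both handlers let concurrent users log without clobbering each other.
    import logging
    import logging.handlers

    logger = logging.getLogger("shared_script")
    logger.setLevel(logging.INFO)

    # Option 1: the local Windows event log (requires pywin32).
    logger.addHandler(logging.handlers.NTEventLogHandler("SharedScript"))

    # Option 2: ship records to a syslog sink over 514/UDP.
    logger.addHandler(
        logging.handlers.SysLogHandler(address=("sysloghost", 514)))

    logger.info("user %s started a run", "someuser")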

Enji