
I have spent hours poring over other questions and have not been able to find a solution to my problem.

I have a program that calls my Python script with an argument on certain events. Basically, every time my script is called, I need it to increment a variable by one and save it to a file. I tried doing it directly (open the file, increment by one, and save) on every call, but I lost counts because of how quickly the calls sometimes come in. I am trying to figure out how to have a script run and just count, then write the count to a file every X minutes. I have the write-to-file part working; I just need help keeping the running count and writing it out every X minutes.

I tried doing this with a single script using threading, but every time the script was called the global variable declarations reset the count. Thanks for the help ahead of time.
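For reference, this is roughly the shape of the long-running counter I am trying to build (the file name and interval are just placeholders):

    import threading
    import time

    count = 0
    count_lock = threading.Lock()
    FLUSH_INTERVAL = 300  # "X minutes", in seconds

    def increment():
        # called once per event; only touches the in-memory counter
        global count
        with count_lock:
            count += 1

    def flush_loop():
        # write the running total to disk every FLUSH_INTERVAL seconds
        while True:
            time.sleep(FLUSH_INTERVAL)
            with count_lock:
                with open("count.txt", "w") as f:
                    f.write(str(count))

    # started alongside whatever receives the events and calls increment()
    threading.Thread(target=flush_loop, daemon=True).start()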

  • Each time the script runs, it's supposed to remember the count from the previous time it ran? – DarenW Jul 22 '13 at 03:22
  • The call could send increment ticks to a queue, and then every few minutes either the routine or the main program could update the file holding the counts. – Jiminion Jul 22 '13 at 04:10

1 Answer


I read through your question and researched this a bit as well. Before describing what I found, I have a couple of questions:

  1. Is the main program that calls your Python script (say, callee.py) also a Python script? If yes, did you write this main script, or do you have permission to change its source code?
  2. Do you require the call to your Python script to be asynchronous?

I ask this because, if the main program is a Python script that you wrote (or can modify) and the call to callee.py does not need to be asynchronous, then you can use the subprocess.call method, which waits for callee.py to complete. Within callee.py, you keep the open file / update count / save to file logic you have already developed. Since the calls are synchronous, you won't have to worry about missed or incorrect counts.
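A minimal sketch of that synchronous flow (the file name count.txt and the argument passed to callee.py are just placeholders):

    # main script: block until callee.py finishes before handling the next event
    import subprocess

    def on_event(argument):
        # subprocess.call waits for the child process to exit, so each
        # read-increment-write of the count file completes before the next starts
        subprocess.call(["python", "callee.py", argument])

    # callee.py: the open file / update count / save logic
    COUNT_FILE = "count.txt"

    def update_count():
        try:
            with open(COUNT_FILE) as f:
                data = f.read().strip()
        except IOError:
            data = ""
        count = int(data) if data else 0
        with open(COUNT_FILE, "w") as f:
            f.write(str(count + 1))

    if __name__ == "__main__":
        update_count()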

However, if the main program needs to call callee.py asynchronously, or if the main program is not a Python program (an exe), then you will have to synchronize updates to the file storing the counts. A question about how to do this has already been asked on Stack Overflow. Also, please see the link below on Cross Platform File Locking in Python for more information on how you can lock a file, update it, and then release the lock.

This approach will work whenever your script is invoked multiple times from the main program in an asynchronous fashion.
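As a rough sketch of the locking idea (Unix-only here, using flock from the standard fcntl module; on Windows you would use msvcrt.locking or a third-party helper, and the file name is again just a placeholder), callee.py could wrap its read-increment-write in an exclusive lock:

    import fcntl  # Unix-only, standard library
    import os

    COUNT_FILE = "count.txt"

    def update_count():
        # open read/write, creating the file if it does not exist yet
        fd = os.open(COUNT_FILE, os.O_RDWR | os.O_CREAT)
        with os.fdopen(fd, "r+") as f:
            fcntl.flock(f, fcntl.LOCK_EX)   # block until this process owns the lock
            try:
                data = f.read().strip()
                count = int(data) if data else 0
                f.seek(0)
                f.truncate()
                f.write(str(count + 1))
            finally:
                fcntl.flock(f, fcntl.LOCK_UN)

    if __name__ == "__main__":
        update_count()

Concurrent invocations then queue up on the lock instead of overwriting each other's updates.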

I hope this points you in the direction of a solution.

Prahalad Deshpande
  • Thanks. 1. Yes, I can change it, but only minimally and in no way that would hinder/slow its performance. 2. Yes. It could be upwards of 100 calls/sec, so unfortunately I cannot wait for callee.py to complete. Would locking the file queue up the callee.py calls and execute them one by one, or could I still lose counts? I was running into a situation where counts were lost because the file was being accessed simultaneously. Thanks for the links, I am checking them out as well. – user2605404 Jul 23 '13 at 02:54
  • Yes, this is a classic case of concurrent updates to a single shared resource (in your case, the file storing the counts). Synchronizing access to this file is the only sure way to ensure correct counts in an asynchronous scenario (since asynchronous calls typically end up running on a separate thread from the one the main program runs on). And since there are many such asynchronous calls in your case, this is indeed a concurrency issue. – Prahalad Deshpande Jul 23 '13 at 04:22
  • So locking the file should solve my issue? I just want to be sure :) – user2605404 Jul 24 '13 at 02:29