
I have a script which does a lot of computational work and is very resource intensive. When I am running the script (which typically takes several hours to complete), I am almost unable to use my machine, because it grinds to a halt.

I remember in the old days of VB programming there was a `yield()` statement, which forced a CPU-hogging routine to play nice and hand over some CPU cycles to other processes.

My question is, is there a similar construct in Python which allows me to write scripts that play nicely with other processes on my machine?

Typical script below:

# import required libs

if __name__ == '__main__':
    init()
    do_some_expensive_calcs()  # need to periodically 'yield' to other processes here - how do I do it?
Homunculus Reticulli
    You can be `nice`: http://stackoverflow.com/questions/1689505/python-ulimit-and-nice-for-subprocess-call-subprocess-popen – eumiro Jul 27 '12 at 08:04
    The problem with memory hogging processes is that they force everything into swap. If this is what is happening to you then freeing the CPU isn't going to help. – John La Rooy Jul 27 '12 at 08:05

2 Answers


I think you can also do this from within the script itself with the `os` module:

import os
os.nice(19)  # os.nice() takes an increment; the result is clamped to the system maximum (19 on Linux)
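Worth noting: `os.nice()` takes an *increment* rather than an absolute nice value, and passing 0 simply reads the current niceness without changing it. A quick sketch to illustrate (behavior assumes a Unix-like system):

```python
import os

# Passing 0 reads the current niceness without changing it.
before = os.nice(0)

# Raise the niceness (i.e. lower the scheduling priority) by 10.
# The kernel clamps the result to the platform maximum (19 on Linux).
after = os.nice(10)

print(before, after)
```

Keep in mind that an unprivileged process can only *increase* its niceness, so once raised it cannot be lowered back without elevated privileges.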
anotherdave

If you are on a *nix machine, start your program with `nice`:

nice ./prog

You can also "renice" the program while it is running, e.g. with top.
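For example (the PID of your script would normally come from `top` or `pgrep`; here the current shell, `$$`, stands in so the snippet is runnable as-is):

```shell
# Raise the niceness of an already-running process. We renice the
# current shell ($$) as a stand-in for the script's PID; unprivileged
# users may only increase niceness, never decrease it.
renice -n 10 -p "$$"

# Show the new nice value of the process.
ps -o nice= -p "$$"
```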

The other strategy, which is always worth thinking about, is to improve the algorithm.

steffen