
I have a Python program that runs for many days. Memory usage does not increase by much, but the program becomes slower and slower. Is there a tool or utility that will list all function calls and how long they take to complete? Something like guppy/heapy, but for CPU usage.

rook

2 Answers

2

Edit 2:

I just saw that your actual question is already answered in 'How can you profile a Python script?'


Sure, use profile.py.

import profile

def myfunction():
    ...  # the code you want to profile goes here

profile.run('myfunction()')

see also profilers and tips on performance.

Edit: The example above is for a single function. You can also profile and run your whole script from the command line with cProfile:

python -m cProfile myscript.py

To profile your program every time it runs, your script could also look like the following:

import profile

def myfunction():
    for i in range(100):
        print(i)

def myotherfunction():
    for i in range(200):
        print(i)

def main():
    """ main program to run over several days """
    for _ in range(3):
        myfunction()

    myotherfunction()

if __name__ == '__main__':
    profile.run('main()') # will execute your program
                          # and show profiling results afterwards
Remi
  • But I want to run this over my entire application over the course of days. +1 anyway. – rook Sep 15 '11 at 16:06
    You can, but it will probably slow down your program even more because of the overhead. Can't you run it just for a while, check where the bottlenecks are and work on them? – Remi Sep 15 '11 at 16:31
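The overhead concern raised in the comment above can be worked around by profiling only a slice of the run: enable the profiler for a short window, dump the stats to a file, and let the rest of the program run unprofiled. A minimal sketch (the window length and the file name 'window.prof' are made up for illustration):

```python
import cProfile

def step():
    # stand-in for one iteration of the long-running main loop
    return sum(range(1000))

profiler = cProfile.Profile()

profiler.enable()            # start measuring
for _ in range(100):         # profile only a short window of the run
    step()
profiler.disable()           # later iterations run at full speed again

profiler.dump_stats('window.prof')  # inspect later with pstats
```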
0

Get the PID of the process:

ps -ef | grep "processname" | awk '{print $2}'

Then see which system calls the program is spending the most time in (note that strace counts system calls, not Python function calls):

strace -c -p "pid"

You can even run the whole script under strace with:

strace -c script.file

krisdigitx