I have a financial pricing application written in Python 2.4.4 which runs as an Excel plugin. Excel imposes a 1 GB memory limit across all add-ins, so if the add-in processes try to allocate more than 1 GB in total, Excel will crash.
I've recently made a change to the program which may have altered its overall memory requirements. I'd like to work out whether anything has changed significantly; if not, I can reassure my management that there's no chance of a failure due to increased memory usage.
Incidentally, the same function that runs in Excel can also be run from the command line:
Effectively, I can supply all the arguments that Excel would generate at a regular Windows prompt. This means that if I have a way to discover the memory requirements of the process on its own, I can safely infer that it will use a similar amount when run from Excel.
I'm not interested in the detail that a memory profiler would give: I don't need to know how much memory each function allocates. What I do need to know is the smallest amount of memory the program requires in order to run, and I need to guarantee that if the program is run within a limit of 1 GB it will run OK.
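One rough idea I've had for the "guarantee" part: since a 32-bit Windows process only gets roughly 2 GB of user address space, I could allocate a ballast block up front to shrink the headroom to 1 GB, then call the pricing function in-process; if it completes without a `MemoryError`, it fits. A sketch (the 2 GB figure, and the assumption that address space is the binding constraint rather than Excel's own accounting, are mine; the nested `try` is for 2.4 compatibility):

```python
def fits_within(func, limit_bytes, address_space_bytes):
    # Hold ballast so only limit_bytes of address_space_bytes remain,
    # then run func; a MemoryError means func needs more than the limit.
    ballast = " " * (address_space_bytes - limit_bytes)  # 2.4-safe allocation
    try:
        try:
            func()
            return True
        except MemoryError:
            return False
    finally:
        del ballast

# On the real box, assuming ~2 GB of 32-bit user address space and a
# hypothetical run_pricing() wrapping the command-line entry point:
# ok = fits_within(run_pricing, 1024 * 1024 * 1024, 2048 * 1024 * 1024)
```

I'm aware this is crude: the ballast must be contiguous, so fragmentation could make the allocation itself fail, and holding it changes the allocator's behaviour, so a pass is suggestive rather than a hard guarantee.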
Any suggestions as to how I can do this?
Platform: Windows XP 32-bit, Python 2.4.4.