I am using The Grinder with a Python script that calls some Java APIs and records, for each API and each thread, the minimum, maximum, number of executions, and total execution time (the latter two give me the average execution time). The data is stored in a list of lists:
#Contents of apiTimingsList array: [min, max, number of executions, total execution time]
apiTimingsList = [[9999,0,0,0] for j in range(len(apiList))]
I am investigating some memory issues and I suspect this ever-growing data might be the problem, since it accumulates for as long as the test runs. For example, with 10 APIs and 900 threads, there are 9,000 of these arrays being updated for the duration of the test.
Is there a way to limit the size of these arrays, say by keeping only the last x executions, so that my calculations stay valid but the arrays do not grow out of control?
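For context, the kind of bounded "last x executions" window I have in mind looks roughly like the sketch below (`apiList` and `WINDOW_SIZE` are placeholders, and this is not Grinder-specific). It uses `collections.deque` with `maxlen`, which evicts the oldest sample automatically; note that the `maxlen` argument requires Python 2.6+, so on an older Jython the oldest element would have to be popped manually once the window is full. The trade-off is that min/max/average are then computed over only the most recent samples rather than the whole run:

```python
from collections import deque

WINDOW_SIZE = 1000  # placeholder: how many recent samples to keep per API
apiList = ["login", "search"]  # placeholder API names

# One bounded window of recent execution times per API.
apiWindows = [deque(maxlen=WINDOW_SIZE) for _ in range(len(apiList))]

def record(apiIndex, elapsed):
    # Appending past maxlen silently drops the oldest sample.
    apiWindows[apiIndex].append(elapsed)

def stats(apiIndex):
    # Returns (min, max, count, total) over the current window,
    # matching the layout of apiTimingsList; None if no samples yet.
    window = apiWindows[apiIndex]
    if not window:
        return None
    return min(window), max(window), len(window), sum(window)
```

Memory per API/thread is then bounded by the window size instead of growing with the test duration.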