
I am using The Grinder with a Python script that executes some Java APIs and gathers the minimum, maximum, number of executions, and total execution time (the latter two are used to compute the average execution time). This is done for each API and each thread (it's a multidimensional array).

#Contents of apiTimingsList array: [min, max, number of executions, total execution time]
apiTimingsList = [[9999,0,0,0] for j in range(len(apiList))]

I am investigating some memory issues and I think that the growing size of this array might be a problem. It will grow constantly as the test runs. For example, if I have 10 APIs and I am running 900 threads, there are 9000 arrays that will keep growing as long as the test runs.

Is there a way to limit the size of these arrays, to say only keep the last x number of executions so my calculations are still valid but the arrays are not growing out of control?

Matt

2 Answers

You can use collections.deque:

>>> from collections import deque
>>> d = deque(maxlen=2)
>>> d.append(3)
>>> d.append(4)
>>> d.append(5)
>>> d
deque([4, 5], maxlen=2)
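Applied to your scenario, you could keep one bounded deque of raw timings per API and derive min/max/count/total from the retained window on demand. This is a sketch, not your exact script: `apiList`, `WINDOW`, `record`, and `stats` are illustrative names, and the real timing values would come from your Grinder test callbacks.

```python
from collections import deque

# Hypothetical API list; in the real script this comes from apiList.
apiList = ["login", "search", "checkout"]

WINDOW = 1000  # keep only the most recent 1000 execution times per API

# One bounded deque per API: once full, each append silently
# discards the oldest timing, so memory use stays constant.
apiTimings = [deque(maxlen=WINDOW) for _ in apiList]

def record(api_index, elapsed_ms):
    """Record one execution time for the given API."""
    apiTimings[api_index].append(elapsed_ms)

def stats(api_index):
    """Return (min, max, count, total) over the retained window."""
    window = apiTimings[api_index]
    if not window:
        return None
    return min(window), max(window), len(window), sum(window)

record(0, 12)
record(0, 7)
record(0, 20)
print(stats(0))  # (7, 20, 3, 39)
```

Note that the statistics now describe only the last `WINDOW` executions rather than the whole run, which matches the "keep the last x executions" behavior you asked about. `collections.deque` with `maxlen` is available from Python 2.6 / Jython 2.5 onward, so it should work under The Grinder's Jython interpreter.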
Blender

A deque from the collections module will probably accomplish what you want.

g.d.d.c