I construct a list of DataFrames with:
def compute_list_of_comprehensive_SP_series(instrument_now):
    series_now = normalized_price_history[instrument_now].copy()
    series_now.name = series_now.name + "_Settle"
    comprehensive_SP_series = compute_comprehensive_time_series_pd_based_on_time_series_now(
        series_now, MA_list_to_consider, resample_to_day=False)
    return comprehensive_SP_series

list_of_comprehensive_SP_series = []
for j in to_trade_instruments:
    list_of_comprehensive_SP_series.append(compute_list_of_comprehensive_SP_series(j))
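The top numbers below can also be cross-checked from inside the process; a minimal sketch, assuming the third-party psutil package is installed (psutil is not part of my code above):

import os
import psutil

def rss_mb():
    # resident set size of this process in MB; corresponds to the RES column in top
    return psutil.Process(os.getpid()).memory_info().rss / 1e6

print(rss_mb())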
The memory usage has gone up from
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
3672 ******** 20 0 1457m 453m 26m S 0.0 0.4 0:05.23 python
to
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
3672 ******** 20 0 1789m 785m 26m S 0.0 0.6 0:29.41 python
The total size of all the DataFrames added together is indeed 300+ MB, measured by:
import sys

total_size = 0
for j in list_of_comprehensive_SP_series:
    total_size += sys.getsizeof(j)
print total_size / 1e9
outputting:
0.382917456
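Note that sys.getsizeof may not account for everything a DataFrame references, depending on the pandas version; as a cross-check, pandas' own memory_usage(deep=True) sums the per-column byte counts (a sketch over the same list):

total_bytes = 0
for j in list_of_comprehensive_SP_series:
    # deep=True also counts the payload of object-dtype columns, not just the pointers
    total_bytes += j.memory_usage(deep=True).sum()
print(total_bytes / 1e9)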
However, after I delete the list of DataFrames and collect the garbage:
del list_of_comprehensive_SP_series
import gc
gc.collect()
The memory usage does not drop at all, as the top output below shows. This is quite frustrating for my memory-intensive application. Any help?
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
3724 ******** 20 0 1788m 784m 26m S 0.0 0.6 0:29.35 python
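For what it's worth, gc.collect() returns the number of unreachable objects it found, so it is at least possible to check whether the DataFrames were freed at the Python level; even when they are, the interpreter's allocator may hold on to the freed pages instead of returning them to the OS. A minimal check:

import gc

n = gc.collect()
print("unreachable objects collected: %d" % n)
# objects the collector could not free (e.g. cycles with __del__ in Python 2) land here
print("uncollectable objects: %d" % len(gc.garbage))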