I am having a problem with a for loop where the memory usage keeps gradually increasing until there is no RAM left. The loop runs for about 10,000 iterations; in each iteration it should read an indexed file from a directory, perform some operations using fmincon, and then save the output to another indexed CSV file. Each input file is smaller than 1 MB, yet after processing 20 files the memory usage goes from 1 GB to 2 GB. (Initially I thought the problem was using 'parfor', but I see this memory "leak" even in a normal for loop; see my earlier question about that.) The code follows below, where 'my_func' is the function I am minimizing. I can provide this function upon request, but since it is "encapsulated" I would not expect it to matter.
list = dir('~/Documents/matlab_files/*.csv');
L = length(list);
for i = 1:L
    data = readtable(fullfile('~/Documents/matlab_files', list(i).name));
    all_para = [0.03, 0.3, 0.001, 0.001];    % initial guess for the four parameters
    try
        [x, fval] = fmincon(@(p) my_func(p, data), ...
            all_para, [], [], [], [], ...
            [-5, -5, 0.01, 0.01], ...        % lower bounds
            [5, 5, 5, 5]);                   % upper bounds
        % list(i).name already ends in .csv, so no extra extension is appended
        csvwrite(fullfile('~/Documents/matlab_files/output', list(i).name), x);
    catch ME
        %fprintf('without success: %s\n', ME.message);
        %continue;  % jump to the next iteration of the for loop
    end
end
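To quantify the growth, this is the kind of minimal per-iteration check that can be dropped into the loop (a sketch, not part of the code above; it assumes a Unix system, matching the ~ paths, since MATLAB's memory function is Windows-only, and it relies on the undocumented feature('getpid') call):

pid = feature('getpid');  % PID of this MATLAB process (undocumented feature call)
for i = 1:L
    % ... same per-file readtable/fmincon/csvwrite work as in the loop above ...
    [~, rss] = system(sprintf('ps -o rss= -p %d', pid));  % resident set size, in KB
    fprintf('iteration %d: %.1f MB resident\n', i, str2double(rss)/1024);
end

Since ps -o rss= prints only the resident set size in kilobytes, the output parses cleanly with str2double, and the per-file growth shows up directly in the printed numbers rather than only in the system monitor.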
Again: I originally wanted to use parfor, but the memory growth happens in a regular for loop as well.
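For completeness, the parfor version I started from was identical apart from the loop keyword, roughly:

parfor i = 1:L
    data = readtable(fullfile('~/Documents/matlab_files', list(i).name));
    % ... same try/fmincon/csvwrite body as in the for loop above ...
end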
I am using MATLAB R2018b.