I'm writing a function that, given the path of a folder, downloads all the FTP files from this URL (it opens in Chrome) and also returns a structure or cell array with the information from those files.
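The download part of the function is roughly like this (the FTP host and remote folder below are only placeholders, and folder_path stands for the folder given to the function):
f = ftp('ftp.example.com');      % connect to the FTP server (placeholder host)
cd(f, 'somefolder');             % move to the remote folder that holds the .txt files (placeholder path)
mget(f, '*.txt', folder_path);   % copy every .txt file into the folder given to the function
close(f);
cd(folder_path);                 % the rest of the function works inside that folder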
So far, I have had no trouble with most of the files, but the ones with lots of rows of data are giving me problems (for example, 'british.txt'). I cannot read data from files that have 2000 rows or more; with the shorter files I have no problems.
The relevant part of my function that uses textscan is this:
list_txt = dir('*.txt');               % after downloading the files, I list them by extension
txt_data = cell(numel(list_txt), 8);   % pre-allocate the cell array that will store all the info I need from the loop
for kk = 1:numel(list_txt)
    fid = fopen(list_txt(kk).name, 'rt');
    txt_single = textscan(fid, '%s %s %s %s %s %f %f %s', 'Delimiter', '|', 'headerLines', 2);
    fclose(fid);
    for jj = 1:numel(txt_single)
        txt_single{jj}(end) = [];      % get rid of the last row, which says #EOF
    end
    txt_data(kk, 1:8) = txt_single;    % store the information of every file in one row of this cell array
end
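To get an idea of how far textscan actually gets on one of the problem files, a quick check like this can be run in the command window (just a diagnostic sketch, using 'british.txt' as the example):
fid = fopen('british.txt', 'rt');
txt_single = textscan(fid, '%s %s %s %s %s %f %f %s', 'Delimiter', '|', 'headerLines', 2);
rows_read = numel(txt_single{1});   % rows parsed before textscan stopped
stop_pos  = ftell(fid);             % byte offset in the file where it stopped
err_msg   = ferror(fid);            % last error message for this file, if any
fclose(fid);
fprintf('Read %d rows, stopped at byte %d, error: "%s"\n', rows_read, stop_pos, err_msg);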
Now, I saw this post and tried some of the approaches, but I couldn't get any results from them; maybe I'm doing something wrong.
I'd be glad if anyone could help me out with this.
EDIT:
Some files are not being read in their entirety (british, US, swedish, and a couple more).
So I tried performing the operation in the command window, using this code for, let's say, 'british.txt':
blockSize = 1000;
fid = fopen('british.txt', 'rt');
txt_single = textscan(fid, '%s %s %s %s %s %f %f %s', blockSize, 'Delimiter', '|', 'headerLines', 2);
fclose(fid);
The outcome is not even close to what I want, which is to read around 2000 rows of information.
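For reference, the looped, block-by-block version I was going for is roughly this (my own adaptation of the idea, so the loop structure and variable names are mine, not code from that post):
blockSize = 1000;
fid = fopen('british.txt', 'rt');
blocks = {};
hdr = 2;                                 % skip the two header lines only on the first call
while ~feof(fid)
    block = textscan(fid, '%s %s %s %s %s %f %f %s', blockSize, 'Delimiter', '|', 'headerLines', hdr);
    hdr = 0;
    if isempty(block{1})
        break;                           % nothing more could be parsed
    end
    blocks{end+1} = block; %#ok<AGROW>
end
fclose(fid);
txt_single = cell(1, 8);                 % stitch the blocks back together, column by column
for col = 1:8
    parts = cellfun(@(b) b{col}, blocks, 'UniformOutput', false);
    txt_single{col} = vertcat(parts{:});
end
The idea is that each call to textscan picks up where the previous one stopped, so once the blocks are concatenated they should cover the whole file.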