Just on a hunch, try this. First, add a new class to represent a set of parameters for myfunc. (It could even be a Tuple<string, string>.)
public class MyFuncParameters
{
    public string UserName { get; set; }
    public string File { get; set; }
}
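If you'd rather not declare a class at all, a minimal sketch of the tuple-based alternative mentioned above (using C# value tuples with named fields; the sample values are just for illustration, and myfunc is your existing method):

```csharp
// A value tuple with named fields can stand in for MyFuncParameters.
var filesToProcess = new List<(string UserName, string File)>();

// Sample entry for illustration only.
filesToProcess.Add((UserName: "matthew", File: "test.txt"));

foreach (var fileToProcess in filesToProcess)
{
    myfunc(fileToProcess.UserName, fileToProcess.File);
}
```

The named fields keep the call site as readable as the class version without the extra type.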
Then modify your original method like this:
string line;
var filesToProcess = new List<MyFuncParameters>();

// Wrap the reader in a using block so the file handle is released
// as soon as the read loop finishes.
using (StreamReader ImportFile = new StreamReader(@"c:\users\matthew\desktop\test.txt"))
{
    while ((line = ImportFile.ReadLine()) != null)
    {
        doneTotal++;
        string[] info = line.Split('-');
        string username = info.Length >= 1 ? info[0] : null;
        string file = info.Length >= 2 ? info[1] : null;
        filesToProcess.Add(new MyFuncParameters { File = file, UserName = username });
    }
}

foreach (var fileToProcess in filesToProcess)
{
    myfunc(fileToProcess.UserName, fileToProcess.File);
}
In other words, first read everything you need from the one file, and then, if you're iterating through another list of files (built from the original file), do that as a separate step. You may see some improved performance by not reading one file and then doing something (myfunc) with another file in between reads of the original file.
That's a guess. It very likely depends on what exactly myfunc does, since you indicated that's the part that's slow.
As stated in the comments, you can launch as many parallel threads to read files as you want, but only one of them can actually read from the disk at a time, so it doesn't really do any good. It could even make things slower.