I am currently trying to implement a simple web downloader that downloads files recursively from a single directory on an HTTP server.
This is what I have so far to list the files on the server (Updater.cs):
public static List<string> remote_filecheck()
{
    List<string> rfiles = new List<string>();
    string url = "http://********/patchlist.txt";

    // Download the patchlist to a local file first.
    using (WebClient client = new WebClient())
    {
        client.DownloadFile(url, @"patchlist.txt");
    }

    // Read the patchlist line by line; every line is one remote file URL.
    string line;
    using (StreamReader reader = new StreamReader("patchlist.txt"))
    {
        while ((line = reader.ReadLine()) != null)
        {
            rfiles.Add(line);
        }
    }

    return rfiles;
}
At the moment I work with a patchlist that consists of direct links to all of my HTTP files.
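Downloading from such a list is not the problem; a minimal sketch of that part could look like this (the helper name download_patchfiles and the targetDir parameter are placeholders, not code from my project):

// Assumes: using System; using System.Collections.Generic;
//          using System.IO; using System.Net;
// Sketch only: downloads every file named in the patchlist into targetDir.
public static void download_patchfiles(string targetDir)
{
    foreach (string fileUrl in remote_filecheck())
    {
        // Derive the local file name from the URL.
        string fileName = Path.GetFileName(new Uri(fileUrl).LocalPath);

        using (WebClient client = new WebClient())
        {
            client.DownloadFile(fileUrl, Path.Combine(targetDir, fileName));
        }
    }
}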
I have tried nearly every snippet I could find on the web concerning recursive downloading, e.g. approaches based on RegEx, WebRequest, and so on.
Now I would like to know whether there is a good way to walk my HTTP server recursively and list all the file names; that is really all I need.
Once I have a List<string> of file names, I can do the rest myself. A rough sketch of the kind of recursion I have been attempting is shown below.
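For illustration, this is roughly what I have been trying, assuming the server exposes plain HTML directory listings (the regex, the method name list_remote_files, and the skip rules are placeholders, not working code from my project):

// Assumes: using System; using System.Collections.Generic;
//          using System.Net; using System.Text.RegularExpressions;
// Sketch only: works when the server returns an HTML directory index.
public static void list_remote_files(string baseUrl, List<string> rfiles)
{
    using (WebClient client = new WebClient())
    {
        string html = client.DownloadString(baseUrl);

        // Placeholder pattern: collect every href on the index page.
        foreach (Match m in Regex.Matches(html, "href\\s*=\\s*\"([^\"]+)\""))
        {
            string href = m.Groups[1].Value;
            if (href.StartsWith("?") || href.StartsWith("/") || href.StartsWith(".."))
                continue; // skip sort links and the parent-directory entry

            string absolute = new Uri(new Uri(baseUrl), href).AbsoluteUri;

            if (href.EndsWith("/"))
                list_remote_files(absolute, rfiles); // subdirectory: recurse
            else
                rfiles.Add(absolute);                // file: remember its URL
        }
    }
}

Called as list_remote_files("http://********/", rfiles), it would fill rfiles with absolute file URLs, but of course this only works if directory listing is enabled on the server.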