I'm making a tool that downloads images from the internet concurrently, using a List<Uri> and the WebClient class. Here is the relevant code.

The WebClient subclass I am using:
public class PatientWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri uri)
    {
        WebRequest w = base.GetWebRequest(uri);
        w.Timeout = Timeout.Infinite;
        return w;
    }
}
and the download methods:
public static void DownloadFiles()
{
    string filename = string.Empty;
    while (_count < _images.Count())
    {
        PatientWebClient client = new PatientWebClient();
        client.DownloadDataCompleted += DownloadCompleted;
        filename = _images[_count].Segments.Last().ToString();
        if (!File.Exists(_destinationFolder + @"\" + filename))
        {
            try
            {
                client.DownloadDataAsync(_images[_count], _images[_count]);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }
        }
        ++_count;
    }
}
private static void DownloadCompleted(object sender, DownloadDataCompletedEventArgs e)
{
    if (e.Error == null)
    {
        Uri uri = (Uri)e.UserState;
        string saveFilename = uri.Segments.Last().ToString();
        byte[] fileData = e.Result;
        if (saveFilename.EndsWith(".jpg") || saveFilename.EndsWith(".png") || saveFilename.EndsWith(".gif"))
            using (FileStream fileStream = new FileStream(_destinationFolder + @"\" + saveFilename, FileMode.Create))
                fileStream.Write(fileData, 0, fileData.Length);
        else
            using (FileStream fileStream = new FileStream(_destinationFolder + @"\" + saveFilename + ".jpg", FileMode.Create))
                fileStream.Write(fileData, 0, fileData.Length);
        ++_downloadedCounter;
        ((WebClient)sender).Dispose();
    }
}
The issue is that not all of the images from the list _images are downloaded. If I click the download button a second time, more of them come down, and it actually takes a few clicks to bring everything in. Are the WebClients timing out, and if so, is there a way to get them to automatically retry the download? If not, what is the proper way to resolve this issue?
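For what it's worth, here is a rough sketch of a synchronous retry wrapper I have been experimenting with as a fallback. DownloadWithRetry, maxAttempts, and the one-second pause are my own hypothetical additions, not part of the code above, and this gives up the async behavior:

public static class RetryDownloader
{
    // Hypothetical helper: downloads a URI synchronously, retrying on WebException.
    // maxAttempts and the fixed delay are arbitrary values chosen for testing.
    public static byte[] DownloadWithRetry(Uri uri, int maxAttempts)
    {
        for (int attempt = 1; ; ++attempt)
        {
            try
            {
                using (var client = new PatientWebClient())
                    return client.DownloadData(uri);
            }
            catch (WebException ex)
            {
                if (attempt >= maxAttempts)
                    throw; // out of attempts, surface the last error
                Console.WriteLine("Attempt {0} failed for {1}: {2}", attempt, uri, ex.Message);
                Thread.Sleep(1000); // brief pause before retrying
            }
        }
    }
}

This does bring down the stragglers, but serially, so I'd still prefer a fix that keeps the concurrent DownloadDataAsync approach.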