I have a huge list of web pages that display a status I need to check. Some of the URLs are on the same site; another set is located on a different site.
Right now I'm trying to do this in parallel with code like the snippet below, but I have the feeling that I'm causing too much overhead.
while (ListOfUrls.Count > 0)
{
    Parallel.ForEach(ListOfUrls, url =>
    {
        using (WebClient webClient = new WebClient())
        {
            string result = webClient.DownloadString(url);
            // ... run my checks here ...
        }
    });

    ListOfUrls = GetNewUrls.....
}
Can this be done with less overhead and more control over how many WebClients and connections I use/reuse, so that in the end the job gets done faster?
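For example, would something along these lines be a better direction? This is only a rough, untested sketch: the connection limit and parallelism numbers are guesses, and GetNewUrls is a placeholder standing in for my real refresh logic.

using System.Collections.Generic;
using System.Net;
using System.Threading.Tasks;

class StatusChecker
{
    static void Main()
    {
        // Hypothetical seed list; in my real code this comes from elsewhere.
        List<string> listOfUrls = new List<string> { "http://example.com/status" };

        // Raise the per-host connection limit (it defaults to 2 for client apps),
        // since many of my URLs share the same two hosts.
        ServicePointManager.DefaultConnectionLimit = 20;

        ParallelOptions options = new ParallelOptions
        {
            // Cap the number of simultaneous downloads instead of
            // letting Parallel.ForEach decide on its own.
            MaxDegreeOfParallelism = 10
        };

        while (listOfUrls.Count > 0)
        {
            Parallel.ForEach(listOfUrls, options, url =>
            {
                using (WebClient webClient = new WebClient())
                {
                    string result = webClient.DownloadString(url);
                    // ... run my checks here ...
                }
            });

            listOfUrls = GetNewUrls(); // placeholder for my real refresh logic
        }
    }

    // Stub so the sketch compiles; my real method builds the next batch of URLs.
    static List<string> GetNewUrls()
    {
        return new List<string>();
    }
}

My thinking is that ServicePointManager.DefaultConnectionLimit would govern how many connections each host gets, while MaxDegreeOfParallelism caps how many downloads run at once, but I'm not sure whether that actually reduces the overhead.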