I'm trying to download files at the same time. The steps are, like a tree:
-Download the string of the main link
-Take the subLinks from it
-Download the subLinks at the same time (~590 subLinks to download)
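To make that concrete, the pattern I am aiming for looks roughly like this; a minimal sketch, where link, xpath and the Windows-1251 encoding stand for the same values as in my real code, and DownloadAllAsync is just a name I made up for the sketch:

using System;
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Text;
using System.Threading.Tasks;
using HtmlAgilityPack;

static async Task DownloadAllAsync(string link, string xpath)
{
    // 1) Download the main page.
    string mainHtml;
    using (var wc = new WebClient { UseDefaultCredentials = true, Encoding = Encoding.GetEncoding(1251) })
    {
        wc.Proxy.Credentials = CredentialCache.DefaultCredentials;
        mainHtml = wc.DownloadString(link);
    }

    // 2) Take the subLinks from it.
    var hd = new HtmlDocument();
    hd.LoadHtml(mainHtml);
    var subLinks = hd.DocumentNode.SelectNodes(xpath)
                     .Select(n => n.GetAttributeValue("href", ""))
                     .ToList();

    // 3) Start all the downloads at (almost) the same time and wait for all of them.
    var sw = Stopwatch.StartNew();
    var tasks = subLinks.Select(async url =>
    {
        using (var wc = new WebClient { UseDefaultCredentials = true, Encoding = Encoding.GetEncoding(1251) })
        {
            wc.Proxy.Credentials = CredentialCache.DefaultCredentials;
            return await wc.DownloadStringTaskAsync(new Uri(url));
        }
    }).ToList();
    string[] pages = await Task.WhenAll(tasks);

    Console.WriteLine($"{pages.Length} pages in {sw.ElapsedMilliseconds} ms");
}

And below is the actual code I have right now: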
static async Task Direct()
{
    // Download the main page and parse it with HtmlAgilityPack.
    WebClient wc2 = new WebClient() { UseDefaultCredentials = true, Encoding = Encoding.GetEncoding(1251) };
    HtmlDocument hd = new HtmlDocument();
    wc2.Proxy.Credentials = CredentialCache.DefaultCredentials;
    hd.LoadHtml(wc2.DownloadString(link));

    // Start one download per subLink (~590 of them).
    foreach (var item in hd.DocumentNode.SelectNodes(xpath).Select(x => x.GetAttributeValue("href", "")))
    {
        Task.Run(() =>
        {
            // Not wrapped in a using block: the client would be disposed while the request
            // is still in flight. It is disposed in the completion handler instead.
            WebClient wc = new WebClient() { UseDefaultCredentials = true, Encoding = Encoding.GetEncoding(1251) };
            wc.DownloadStringCompleted += Over;
            wc.Proxy.Credentials = CredentialCache.DefaultCredentials;
            wc.DownloadStringAsync(new Uri(item));
        });
    }
    Console.Title = "ready";
}
static Stopwatch sw = new Stopwatch();   // shared by every completion callback

static void Over(object sender, DownloadStringCompletedEventArgs e)
{
    Task.Run(() => overTask((WebClient)sender));
}

static void overTask(WebClient sender)
{
    // Print the elapsed time since the previous completion, then restart the stopwatch.
    if (sw.IsRunning)
    {
        sw.Stop();
        Console.WriteLine(sw.ElapsedMilliseconds);
        sw.Reset();
    }
    sw.Start();
    sender.Dispose();   // the client created in Direct() is disposed here
}
1-This is the code I have. It downloads the subLinks, but not at the same time; it is slower than that. It uses DownloadStringAsync requests, and I hoped for at most 20 seconds for all the strings, but my overall Stopwatch reports at least 35 seconds.
2-These ~1000 ms gaps: I expected at most around 20 ms, because the downloads are supposed to start at the same time and there should not be much difference between them. Same website, different pages (how I measure the gaps is shown in the sketch below).
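For the record, this is how I measure those gaps (the sketch referenced in point 2), so the numbers are not distorted by several completion callbacks resetting the same Stopwatch at once; CompletionTimer.Report is a hypothetical helper that I call once from each completion handler:

using System;
using System.Diagnostics;

static class CompletionTimer
{
    static readonly Stopwatch sw = Stopwatch.StartNew(); // one shared clock for every download
    static readonly object gate = new object();
    static long lastCompletionMs;

    // Called once per finished download, from whatever thread the completion arrives on.
    public static void Report(string url)
    {
        lock (gate) // completions arrive on different thread-pool threads
        {
            long now = sw.ElapsedMilliseconds;
            Console.WriteLine($"{url}: done at {now} ms (+{now - lastCompletionMs} ms since the previous one)");
            lastCompletionMs = now;
        }
    }
}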
I made a few changes to the first code. There are some improvements, but it is still slower than I expected. The result.
As @Liam suggested, I tried doing it with threads:
static async Task Direct()
{
    // Same as before: download the main page and collect the subLinks.
    WebClient wc2 = new WebClient() { UseDefaultCredentials = true, Encoding = Encoding.GetEncoding(1251) };
    HtmlDocument hd = new HtmlDocument();
    wc2.Proxy.Credentials = CredentialCache.DefaultCredentials;
    hd.LoadHtml(wc2.DownloadString(link));

    foreach (var item in hd.DocumentNode.SelectNodes(xpath).Select(x => x.GetAttributeValue("href", "")))
    {
        // The Task.Run version from the first attempt is replaced with a dedicated thread.
        new Thread(() =>
        {
            WebClient wc = new WebClient() { UseDefaultCredentials = true, Encoding = Encoding.GetEncoding(1251) };
            wc.DownloadStringCompleted += Over;
            wc.Proxy.Credentials = CredentialCache.DefaultCredentials;
            wc.DownloadStringAsync(new Uri(item));
        }).Start();
    }
    Console.Title = "ready";
}
static Stopwatch sw = new Stopwatch();   // shared by every completion callback

static void Over(object sender, DownloadStringCompletedEventArgs e)
{
    new Thread(() =>
    {
        // Print the time since the previous completion, then restart the stopwatch.
        if (sw.IsRunning)
        {
            sw.Stop();
            Console.WriteLine(sw.ElapsedMilliseconds);
            sw.Reset();
        }
        sw.Start();
        (sender as WebClient)?.Dispose();   // the client is no longer disposed by a using block
    }).Start();
}
The output in ms (the time between two consecutive downloads) is here.
EDIT: Guys, the foreach loop must be starting them with only small differences. The question still has no answer. What I asked is: if the tasks start at the same time (or at least with only small differences between them) and they download strings of roughly the same length (only 20-30 characters of variation), why is the difference between them so large? Does it depend on the website? Can we make it faster?
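One more thing I am testing, in case it is relevant to the "does it depend on the website" part: if I understand correctly, the framework limits how many HTTP connections can be open to the same host at once, so most of the ~590 requests may simply be queuing behind that limit. A minimal sketch of what I run once before starting the downloads (the value 100 is just a guess):

using System.Net;

// Raise the per-host connection limit before the first request is created;
// with the default limit most of the requests wait in a queue instead of running in parallel.
ServicePointManager.DefaultConnectionLimit = 100;

// Optional: skip the Expect: 100-continue handshake while testing.
ServicePointManager.Expect100Continue = false;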