
I have this code for downloading a page:

    // Requires: using System.Net; and using System.Text;
    System.Net.WebClient client = new System.Net.WebClient();
    client.Headers.Add("user-agent", "Mozilla/20.0.1");

    string url = @"http://www.marathonbet.co.uk/en/betting/Football";
    byte[] feedBytes;
    try
    {
        feedBytes = client.DownloadData(url);
    }
    catch (System.Net.WebException)
    {
        return;   // the exception is silently swallowed here
    }
    string fullPage = Encoding.UTF8.GetString(feedBytes);

As a result, `fullPage` contains only part of the page. In the browser, the page loads gradually. How can I download the full page?

Shootnik
    Why are you ignoring exceptions? Remove that try/catch block and see what happens. At the least, capture the exception and display `ex.ToString()` so you know what's wrong. – John Saunders Jun 04 '13 at 20:00
  • Your code works for me. I get the full page in the `fullPage` after executing your code... – nemesv Jun 04 '13 at 20:04
  • nemesv - I think you received about 300 KB of data; the full page is more than 2 MB. – Shootnik Jun 04 '13 at 20:07
  • I added `Console.WriteLine("WebException");` to the catch block. The result didn't change. – Shootnik Jun 04 '13 at 20:14
  • Very likely the problem is that the page uses JavaScript to generate content, and `WebClient.DownloadData` doesn't execute JavaScript; it just downloads that one page. – Jim Mischel Jun 04 '13 at 20:58
  • Is there a method or class for working with JavaScript? – Shootnik Jun 05 '13 at 08:32
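
Following Jim Mischel's point that `WebClient` never executes the page's JavaScript, one way to get the fully rendered page is to drive a headless browser and read the DOM it produces. Below is a minimal sketch, assuming the Selenium.WebDriver and Selenium.WebDriver.ChromeDriver NuGet packages are installed; apart from the URL taken from the question, all names are illustrative.

    // Minimal sketch: render the page in headless Chrome, then read the final HTML.
    // Assumes the Selenium.WebDriver and Selenium.WebDriver.ChromeDriver NuGet packages.
    using System;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Chrome;

    class RenderedPageDownloader
    {
        static void Main()
        {
            var options = new ChromeOptions();
            options.AddArgument("--headless");   // run Chrome without a visible window

            using (IWebDriver driver = new ChromeDriver(options))
            {
                driver.Navigate().GoToUrl("http://www.marathonbet.co.uk/en/betting/Football");

                // PageSource reflects the DOM after the page's scripts have run,
                // unlike WebClient, which only sees the initial HTML response.
                string fullPage = driver.PageSource;
                Console.WriteLine(fullPage.Length);
            }
        }
    }

The difference is that `PageSource` is read after the scripts have executed, which is why it can contain content that `DownloadData` never receives.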

1 Answer


Try this:

    // Requires: using System.Net;
    public static string GetHTMLDocument(string url)
    {
        var result = "";
        using (var wc = new WebClient())
        {
            result = wc.DownloadString(url);
        }
        return result;
    }
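
A possible call site, reusing the URL from the question (a usage sketch only; `DownloadString`, like `DownloadData`, returns the raw HTML and does not run any script):

    string html = GetHTMLDocument("http://www.marathonbet.co.uk/en/betting/Football");
    Console.WriteLine(html.Length);   // length of the raw HTML that was actually received
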
Smeegs