
I'm trying to fetch data from a URL and then parse its content.

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(URL);
    request.UseDefaultCredentials = true;
    request.PreAuthenticate = true;
    request.CookieContainer = new CookieContainer();
    request.AllowAutoRedirect = true;
    request.Credentials = CredentialCache.DefaultCredentials;

    // Dispose the response and reader so the connection is released.
    using (HttpWebResponse resp = (HttpWebResponse)request.GetResponse())
    using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
    {
        var data = reader.ReadToEnd();
    }

If I check the data, not everything is returned. How can I fetch the dynamic parts that I can see on the page, the way Chrome's dev tools show them in the browser?

Rich
  • `not everything got returned`; like what? Headers perhaps? – Stefan Mar 21 '19 at 06:59
  • Like a table or grid whose data is fetched by JavaScript. – Rich Mar 21 '19 at 07:02
  • Ah, okay, I think I get it; you are trying to get data from a website, which fetches data through ajax calls later on. Can't you just access the data sources directly? So: not retrieving the website, but using its sources instead? Also, keep in mind that not all site owners appreciate their site being scraped ;-) – Stefan Mar 21 '19 at 07:07
  • Yes, gotcha. I probably can't access the data sources directly. It's a kind of monitoring task: instead of watching the page, I'm thinking of fetching it automatically with a program. – Rich Mar 21 '19 at 07:12
  • [How to download html which loads using ajax](https://stackoverflow.com/questions/31951090/c-sharp-how-to-download-html-which-loads-using-ajax) no answer there either – John Wu Mar 21 '19 at 07:31
  • If I open Chrome's dev tools, it is quite easy to locate all the elements displayed on the page. I just don't know how to do the same from code. – Rich Mar 21 '19 at 07:36
  • Pretty hard to do, I'd think. Might be a lot easier to use a tool like Selenium or something else that uses an actual browser to do the heavy lifting. – John Wu Mar 21 '19 at 08:21
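
The Selenium approach suggested in the last comment can be sketched as below. This is only a sketch, assuming the `Selenium.WebDriver` and `Selenium.WebDriver.ChromeDriver` NuGet packages are installed; `URL` and the `table` selector are placeholders for the actual page and the element being monitored:

```csharp
// Sketch: drive a real (headless) Chrome so the page's JavaScript runs,
// then read the rendered DOM. URL and the CSS selector are assumptions.
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class PageMonitor
{
    static void Main()
    {
        var options = new ChromeOptions();
        options.AddArgument("--headless"); // no visible browser window

        using (IWebDriver driver = new ChromeDriver(options))
        {
            // Let FindElement wait for JavaScript to populate the page.
            driver.Manage().Timeouts().ImplicitWait = TimeSpan.FromSeconds(10);

            driver.Navigate().GoToUrl("URL");

            // PageSource is the DOM *after* the ajax calls have run --
            // roughly what the Elements panel in Chrome's dev tools shows.
            string renderedHtml = driver.PageSource;

            // Or target the dynamic table directly (selector is hypothetical):
            IWebElement table = driver.FindElement(By.CssSelector("table"));
            Console.WriteLine(table.Text);
        }
    }
}
```

Alternatively, per Stefan's comment about using the sources directly: the Network tab in Chrome's dev tools (filtered to XHR) shows the requests the page's JavaScript makes; if one of those returns the table data as JSON, the original `HttpWebRequest` code can be pointed at that endpoint instead, with no browser needed.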

0 Answers