
This is a general question regarding web scraping.

When I web-scrape certain sites, I receive a message like this:

Your browser must have browser cookies turned on

I've been searching around for a solution, and I'm guessing it's something to do with storing cookies? I'm new to this, so any guidance would be appreciated.

TeaAnyOne

2 Answers

var getHtmlWeb = new HtmlAgilityPack.HtmlWeb();
getHtmlWeb.UseCookies = true;

Just add getHtmlWeb.UseCookies = true; before loading the page.
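For context, a minimal sketch of how this fits into a typical HtmlAgilityPack scrape (the URL and XPath here are placeholders, not from the question):

```csharp
using System;
using HtmlAgilityPack;

class Scraper
{
    static void Main()
    {
        var getHtmlWeb = new HtmlWeb();

        // Let HtmlWeb accept cookies the server sets and resend them,
        // which satisfies the site's "cookies must be turned on" check.
        getHtmlWeb.UseCookies = true;

        // Load the page as usual; placeholder URL.
        HtmlDocument doc = getHtmlWeb.Load("http://example.com/page");

        // Example query: print the page title, if present.
        var title = doc.DocumentNode.SelectSingleNode("//title");
        Console.WriteLine(title?.InnerText);
    }
}
```

Note that UseCookies must be set before calling Load, since it controls how the request is made.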


You need to enable cookies in the browser you are using. The following wiki will guide you, depending on which browser you have:

http://www.wikihow.com/Enable-Cookies-in-Your-Internet-Web-Browser

  • I have cookies enabled though. Does HTML Agility Pack use Internet Explorer by default, or do I need to specify what browser to use in my C# code? – TeaAnyOne Jan 22 '16 at 18:38
  • My apologies, it looks like your issue is related to HTML Agility Pack, which I am unfamiliar with. Check out this link on Stack Overflow; it may answer your question: http://stackoverflow.com/questions/15206644/how-to-pass-cookies-to-htmlagilitypack-or-webclient – Frank Saraceno Jan 22 '16 at 22:00
  • HTML Agility Pack is not a browser; it mimics what a browser does. You need to turn on cookies as per the accepted answer. – Rob Sedgwick Sep 19 '17 at 11:14
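If you need to send a cookie you already have (for example a session cookie obtained by logging in elsewhere), the question linked in the comments describes attaching a CookieContainer to the outgoing request. A sketch using HtmlWeb's PreRequest hook, with placeholder cookie name, value, and domain:

```csharp
using System.Net;
using HtmlAgilityPack;

class CookiePassingExample
{
    static void Main()
    {
        var web = new HtmlWeb();

        // PreRequest runs just before each request is sent; returning
        // true lets the request proceed with our modifications.
        web.PreRequest = request =>
        {
            request.CookieContainer = new CookieContainer();
            // Placeholder cookie: substitute your real name/value/domain.
            request.CookieContainer.Add(new Cookie(
                "SESSIONID", "your-session-value", "/", "example.com"));
            return true;
        };

        HtmlDocument doc = web.Load("http://example.com/page");
    }
}
```

This is a sketch of the approach, not a drop-in solution; the exact cookie details depend on the site you are scraping.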