I'm writing a web-scraping program in Python using mechanize. The problem I'm having is that the website I'm scraping limits the amount of time you can spend on it. When I was doing everything by hand, I would use a SOCKS proxy as a workaround.
What I tried first was going to the network preferences (MacBook Pro Retina 13-inch, Mavericks) and switching the system over to the proxy. However, the program didn't pick up that change; it kept running without the proxy.
Then I added .set_proxies(), so now the code that opens the website looks something like this:
b = mechanize.Browser()                      # open browser
b.set_proxies({"http": "96.8.113.76:8080"})  # proxy
DBJ = b.open(URL)                            # open url
When I ran the program, I got this error:
Traceback (most recent call last):
File "GM1.py", line 74, in <module>
DBJ=b.open(URL)
File "build/bdist.macosx-10.9-intel/egg/mechanize/_mechanize.py", line 203, in open
File "build/bdist.macosx-10.9-intel/egg/mechanize/_mechanize.py", line 230, in _mech_open
File "build/bdist.macosx-10.9-intel/egg/mechanize/_opener.py", line 193, in open
File "build/bdist.macosx-10.9-intel/egg/mechanize/_urllib2_fork.py", line 344, in _open
File "build/bdist.macosx-10.9-intel/egg/mechanize/_urllib2_fork.py", line 332, in _call_chain
File "build/bdist.macosx-10.9-intel/egg/mechanize/_urllib2_fork.py", line 1142, in http_open
File "build/bdist.macosx-10.9-intel/egg/mechanize/_urllib2_fork.py", line 1118, in do_open
urllib2.URLError: <urlopen error [Errno 54] Connection reset by peer>
I'm assuming that the proxy change took effect and that this error is a response to it.
Maybe I am misusing .set_proxies().
I'm not sure whether the proxy itself is the issue or the connection is just very slow.
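To separate those two possibilities, I figure the proxy can be tested outside mechanize using just the standard library (urllib2 on Python 2, urllib.request on Python 3); if this also resets the connection, the proxy itself is the problem:

```python
# sanity check: build a plain opener that uses the same proxy,
# independent of mechanize (urllib2 on Python 2, urllib.request on 3)
try:
    import urllib2 as urlreq          # Python 2
except ImportError:
    import urllib.request as urlreq   # Python 3

proxy_handler = urlreq.ProxyHandler({"http": "96.8.113.76:8080"})
opener = urlreq.build_opener(proxy_handler)
# opener.open("http://example.com", timeout=10)  # raises URLError if the proxy is dead
```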
Should I even be using SOCKS proxies for this kind of thing, or is there a better alternative for what I'm trying to do?
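From what I've read, mechanize (like urllib2) only speaks HTTP proxies natively, so a SOCKS proxy would need the third-party PySocks module and a socket-level redirect. My understanding is it would look something like this (the 127.0.0.1:1080 address is just a placeholder):

```python
import socket

# PySocks is a third-party module ("pip install PySocks");
# without it, mechanize/urllib2 have no SOCKS support at all
try:
    import socks
    socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 1080)  # placeholder address
    socket.socket = socks.socksocket  # route every new socket through the proxy
except ImportError:
    socks = None  # PySocks not installed; stuck with HTTP proxies
```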
Any information would be extremely helpful. Thanks in advance.