EDIT - FIXED tl;dr: the semi-old version of Python I installed a couple of years ago shipped an ssl module that could not handle newer SSL/TLS handshakes. After updating Python and making sure the ssl module was up to date, everything worked.
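For anyone hitting the same wall: a quick way to check whether your Python build itself is the problem is to inspect the OpenSSL it was compiled against (a diagnostic sketch; the exact version strings will differ on your machine):

```python
import ssl

# OpenSSL version this Python build links against; very old builds
# (e.g. 0.9.8 on the stock macOS Python 2.7) cannot speak TLS 1.1/1.2
# and can fail with exactly this kind of EOF-in-violation-of-protocol error.
print(ssl.OPENSSL_VERSION)

# Old builds also often lack Server Name Indication (SNI), which many
# HTTPS sites require before they will present a usable certificate.
print(getattr(ssl, "HAS_SNI", False))
```

If the version printed is ancient or SNI is missing, updating Python (or the OpenSSL it links against) is the real fix.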
I'm new to web scraping and wanted to scrape a certain site, but for some reason I'm getting errors when using Python's Requests package on this particular site.
I am working on a secure login so I can scrape data from my user profile. The login page can be found here: https://secure.funorb.com/m=weblogin/loginform.ws?mod=hiscore_fo&ssl=0&expired=0&dest=
I'm just trying to perform simple tasks at this point, like printing the text from a GET request. The following is my code:
import requests
req = requests.get('https://secure.funorb.com/m=weblogin/loginform.ws?mod=hiscore_fo&ssl=0&expired=0&dest=',verify=False)
print req.text
When I run this, an error is thrown:
File "/Library/Python/2.7/site-packages/requests/adapters.py", line 512, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:590)
I've looked in this file to see what's going on. It seems the culprit is

except (_SSLError, _HTTPError) as e:
    if isinstance(e, _SSLError):
        raise SSLError(e, request=request)
    elif isinstance(e, ReadTimeoutError):
        raise ReadTimeout(e, request=request)
    else:
        raise
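So that excerpt is just Requests converting urllib3's low-level failure into its own requests.exceptions.SSLError. One practical consequence for scraper code is that the failure is catchable at the Requests level, since SSLError subclasses Requests' ConnectionError (a sketch of how I could at least handle it cleanly):

```python
import requests
from requests.exceptions import SSLError, RequestException

url = ('https://secure.funorb.com/m=weblogin/loginform.ws'
       '?mod=hiscore_fo&ssl=0&expired=0&dest=')
try:
    req = requests.get(url, verify=False)
    print(req.text)
except SSLError as e:
    # The error re-raised in adapters.py lands here.
    print('TLS handshake failed: %s' % e)
except RequestException as e:
    # Any other transport-level failure (DNS, timeout, ...).
    print('Request failed: %s' % e)
```

That only handles the error rather than avoiding it, of course.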
I'm not really sure how to avoid this, unfortunately; I'm at my debugging limit here.
My code works just fine on other secure sites, such as https://bitbucket.org/account/signin/. I've looked at a ton of solutions on Stack Exchange and around the net, and a lot of people claimed that adding the optional argument verify=False should fix these types of SSL errors (albeit it's not the most secure way to do it). But as you can see from my code snippet, this isn't helping me.
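(From later debugging: verify=False only skips certificate verification, so it can't help when the handshake itself dies, which is why the EOF error survives it. One workaround that is often suggested, sketched here assuming the underlying OpenSSL at least supports TLS 1.2, is pinning the TLS version with a custom transport adapter; urllib3 is importable directly since it ships as a Requests dependency.)

```python
import ssl

import requests
from requests.adapters import HTTPAdapter
from urllib3.poolmanager import PoolManager  # bundled with requests


class Tlsv12Adapter(HTTPAdapter):
    """Transport adapter that pins every HTTPS connection to TLS 1.2."""

    def init_poolmanager(self, connections, maxsize, block=False, **kwargs):
        self.poolmanager = PoolManager(
            num_pools=connections, maxsize=maxsize, block=block,
            ssl_version=ssl.PROTOCOL_TLSv1_2, **kwargs)


session = requests.Session()
session.mount('https://', Tlsv12Adapter())
# session.get(...) on the login URL above would now negotiate TLS 1.2 only.
```

This only works if the local OpenSSL actually implements TLS 1.2, which is exactly what the outdated install was missing, so updating Python remains the real fix.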
If anyone can get this working or give advice on where to go from here, it would be much appreciated.