I'm on Ubuntu 16.04 and created a virtualenv with Python 3.6.5. Inside the virtualenv I ran:
pip install scrapy
This is the output:
pip is configured with locations that require TLS/SSL, however the ssl module in Python is not available.
Collecting scrapy
Cache entry deserialization failed, entry ignored
Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError("Can't connect to HTTPS URL because the SSL module is not available.",)': /simple/scrapy/
Retrying (Retry(total=3, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError("Can't connect to HTTPS URL because the SSL module is not available.",)': /simple/scrapy/
Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError("Can't connect to HTTPS URL because the SSL module is not available.",)': /simple/scrapy/
Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError("Can't connect to HTTPS URL because the SSL module is not available.",)': /simple/scrapy/
Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError("Can't connect to HTTPS URL because the SSL module is not available.",)': /simple/scrapy/
Could not fetch URL https://pypi.python.org/simple/scrapy/: There was a problem confirming the ssl certificate: HTTPSConnectionPool(host='pypi.org', port=443): Max retries exceeded with url: /simple/scrapy/ (Caused by SSLError("Can't connect to HTTPS URL because the SSL module is not available.",)) - skipping
Could not find a version that satisfies the requirement scrapy (from versions: )
No matching distribution found for scrapy
Scrapy is only an example; I get this type of error whenever I try to install anything. If you have any questions about this problem, please ask.
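
In case it helps with diagnosis, here is a quick check I can run with the virtualenv's interpreter (a minimal sketch; I'm assuming the error means this Python was built without the ssl module, but I haven't confirmed that):

# Run with the virtualenv's python. If the interpreter was compiled
# without OpenSSL support, the import below raises ModuleNotFoundError.
import ssl

print(ssl.OPENSSL_VERSION)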