
When I use the requests.get() function in Python 3 with the following commands:

import requests
res = requests.get('http://www.gutenberg.org/cache/epub/1112/pg1112.txt')

then Python 3 throws the following error:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 557, in urlopen
    body=body, headers=headers)
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 351, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.4/http/client.py", line 1137, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python3.4/http/client.py", line 1182, in _send_request
    self.endheaders(body)
  File "/usr/lib/python3.4/http/client.py", line 1133, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python3.4/http/client.py", line 963, in _send_output
    self.send(msg)
  File "/usr/lib/python3.4/http/client.py", line 898, in send
    self.connect()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 155, in connect
    conn = self._new_conn()
  File "/usr/lib/python3/dist-packages/urllib3/connection.py", line 134, in _new_conn
    (self.host, self.port), self.timeout, **extra_kw)
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 90, in create_connection
    raise err
  File "/usr/lib/python3/dist-packages/urllib3/util/connection.py", line 80, in create_connection
    sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 370, in send
    timeout=timeout
  File "/usr/lib/python3/dist-packages/urllib3/connectionpool.py", line 607, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/usr/lib/python3/dist-packages/urllib3/util/retry.py", line 271, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='192.168.15.2', port=8000): Max retries exceeded with url: http://www.gutenberg.org/cache/epub/1112/pg1112.txt (Caused by ProxyError('Cannot connect to proxy.', TimeoutError(110, 'Connection timed out')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3/dist-packages/requests/api.py", line 69, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/api.py", line 50, in request
    response = session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 465, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3/dist-packages/requests/sessions.py", line 573, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python3/dist-packages/requests/adapters.py", line 424, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='192.168.15.2', port=8000): Max retries exceeded with url: http://www.gutenberg.org/cache/epub/1112/pg1112.txt (Caused by ProxyError('Cannot connect to proxy.', TimeoutError(110, 'Connection timed out')))

As far as I know, this means there is no internet connection, but my internet is working fine. So why is Python throwing this error?

winhowes
Rohit Raj

2 Answers


You can increase the timeout (in seconds):

requests.get('http://www.gutenberg.org/cache/epub/1112/pg1112.txt', timeout=30)
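
A minimal sketch of the same call with a timeout and basic error handling; the exception classes are part of requests' public API, and the 30-second value is just the example from above:

import requests

url = 'http://www.gutenberg.org/cache/epub/1112/pg1112.txt'
try:
    # a single timeout value applies to both the connect and the read phase
    res = requests.get(url, timeout=30)
    res.raise_for_status()  # raise on HTTP 4xx/5xx responses
    print(res.text[:200])
except requests.exceptions.ConnectTimeout:
    print('Could not connect within 30 seconds')
except requests.exceptions.ConnectionError as e:
    print('Connection failed:', e)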

alpert
  • It works for me even without the timeout parameter. What happens when you type: `wget http://www.gutenberg.org/cache/epub/1112/pg1112.txt` – alpert May 03 '16 at 05:26
  • It keeps giving this error (Connecting to 192.168.15.2:8000... failed: Connection timed out.) I tried using wget with different sites (for example http://ftp.gnu.org/gnu/wget/wget-1.5.3.tar.gz) but still got the same error – Rohit Raj May 10 '16 at 13:06
  • If you can't run wget either, there is probably a problem with your network and you can't reach that host. – alpert May 10 '16 at 13:09
  • What is at 192.168.15.2:8000? A proxy or something? – alpert May 10 '16 at 14:21
  • I have a dual boot on my PC. On both Windows and Linux I am getting the same error, even when using requests.get on different websites. Since the module is so popular I am surely making some elementary error, but I cannot figure out what. – Rohit Raj May 12 '16 at 08:09

Found the answer using alpert's help and this thread: Proxies with Python 'Requests' module

I don't have any proxy, but Python was still trying to use one, so I created a dict element:

proxies = {'http': ''}

Then this command worked:

res = requests.get('http://www.gutenberg.org/cache/epub/1112/pg1112.txt', proxies=proxies)
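
For context, requests normally picks up a proxy from environment variables such as http_proxy / HTTP_PROXY, which is the most likely source of the 192.168.15.2:8000 address in the traceback. A minimal sketch, assuming that is the case, showing how to find the setting and two ways to bypass it (the per-request proxies dict from above, or session.trust_env = False):

import os
import requests

# requests reads proxy settings from environment variables;
# print any that are set to see where 192.168.15.2:8000 comes from
for var in ('http_proxy', 'https_proxy', 'HTTP_PROXY', 'HTTPS_PROXY'):
    if var in os.environ:
        print(var, '=', os.environ[var])

url = 'http://www.gutenberg.org/cache/epub/1112/pg1112.txt'

# option 1: override the proxy for a single request (as in the answer above)
res = requests.get(url, proxies={'http': ''})

# option 2: ignore environment proxy settings for a whole session
session = requests.Session()
session.trust_env = False
res = session.get(url)
print(res.status_code)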

Rohit Raj