
The following GET request works on my personal computer:

import requests
requests.get('https://example.org')

However, the same request on my work laptop results in this error:

ConnectionError: ('Connection aborted.', TimeoutError(10060, 'A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond', None, 10060, None))

What I've tried:

    1. Setting up a proxy

Based on the answers given there, I tried to set up the proxies using the code below, but it still returned the same error.

import urllib.request

requests.get('https://example.org',
             proxies=urllib.request.getproxies())

However, when I did a check at http://www.whatismyproxy.com/, it said that "no proxies were detected". urllib.request.getproxies() also returned an empty dict {}.
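To illustrate, getproxies() can be exercised directly to confirm what requests would pick up; the proxy host and port below are hypothetical placeholders, not a real address:

```python
import os
import urllib.request

# getproxies() merges proxy-related environment variables with platform
# settings (the Windows registry on Windows, SystemConfiguration on
# macOS); an empty dict means no proxy is configured in any source.
print(urllib.request.getproxies())

# If the corporate proxy address were known, exporting it would make
# both getproxies() and requests pick it up automatically:
os.environ["HTTPS_PROXY"] = "http://proxy.example.com:8080"
print(urllib.request.getproxies())  # now includes an 'https' entry
```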


    2. Adding an SSL cert

I also tried adding an SSL cert, but it still produced the same error.

import urllib.request

requests.get('https://example.org',
             proxies=urllib.request.getproxies(),
             verify=requests.certs.where())

So I'm not sure what else I can do to make the GET request work on my work laptop.
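If I could find out the proxy address (presumably from IT, or from the browser's proxy settings, e.g. `netsh winhttp show proxy` on Windows), I could skip auto-detection and pass it explicitly. A sketch of what that would look like, with a placeholder proxy address:

```python
import requests

# Placeholder corporate proxy address -- the real value would come from
# IT or the machine's browser/system proxy settings.
proxies = {
    "http": "http://proxy.example.com:8080",
    "https": "http://proxy.example.com:8080",
}

session = requests.Session()
session.proxies.update(proxies)

# An explicit timeout would turn the silent 10060 hang into a fast,
# explicit failure instead of waiting on the OS default:
# session.get('https://example.org', timeout=10)
print(session.proxies)
```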

Jon
  • Can you access the URL using a browser on your company network? There may be a firewall in place that's blocking certain outbound connections – DarkKnight Oct 17 '22 at 08:35
  • @OldBill Yes, I can access it on my company network – Jon Oct 17 '22 at 08:37
  • Have you tried another link like google.com on your work computer? maybe python.exe is blocked on firewall settings – Ali Ent Oct 17 '22 at 08:48
  • @AliEnt Google works, I've just tried `requests.get('https://www.google.com/')` and it returned ``, but it doesn't work for the link that I want to access – Jon Oct 17 '22 at 08:56
  • Test Google with proxy to make sure the proxy is working – Ali Ent Oct 17 '22 at 08:59
  • @Jon Try adding a User-Agent header – DarkKnight Oct 17 '22 at 08:59
  • @AliEnt `requests.get('https://www.google.com/', proxies=urllib.request.getproxies())` still works but `urllib.request.getproxies()` returns an empty dict anyway so is there another way to find the proxy ip? – Jon Oct 17 '22 at 09:04
  • @OldBill Tried adding `headers={'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.0.0 Safari/537.36'}`, works for google but not for other links – Jon Oct 17 '22 at 09:07
  • There are some websites that give you free proxies, you can search "free proxy list". but my experience was bad with them, most of them are not working. If you have problem with them you can buy one. – Ali Ent Oct 17 '22 at 09:09
  • @Jon Facing the same issue, have u found anything that worked? – ZooPanda Nov 04 '22 at 08:18

1 Answer


I just reported a bug on Python, and it might concern this same issue. Basically, I am using Airflow (Python) on a Linux server, and when I try to use OAuth it works on Linux but crashes on macOS. I traced it down to a simple change in a proxy-detection function! Hope it helps.

https://github.com/python/cpython/issues/104578

  • Hey, I don't think this is related. You say "crashes", but it looks like it only raises an exception, which isn't the issue in the question. Also, since it's a work laptop, it's unlikely he's using a Mac. – Peter May 17 '23 at 13:44
  • @Peter When you call "getproxies_environment" instead of "getproxies_macosx_sysconf" on macOS, the Python thread actually sometimes "crashes", as in it stops and the system reports the thread crashed. Also, do not assume it is not a Mac. It could very well be that the laptop is a Mac. In our company we provide Mac laptops to all our IT people. – Marcelo Gaio May 17 '23 at 13:56
  • Also, at the time I detected this bug, I also detected a lot of issues in the underlying code that manages the proxy call, where there are mismanaged exceptions (catching an exception but not returning anything, or raising another exception). So in my case, when the thread didn't crash, it just didn't return anything, causing a timeout. – Marcelo Gaio May 17 '23 at 13:59
  • Ah right apologies then, though if it causes the process to legitimately crash, I wouldn't have thought catching an exception would make much of a difference. For the sake of the question though, he _did_ say `urllib.request.getproxies()` returns an empty dict so that part of the code doesn't appear to have an issue. – Peter May 17 '23 at 16:38
  • @Peter Actually, I apologize. It IS related to the proxy functions, but it's not exactly urllib.request.getproxies(), from what it seems. Somebody pointed out on GitHub that the issue is related to "the Python _scproxy helper module to get proxy configurations on macOS uses macOS system frameworks that are documented as not being safe when called in a forked process without an exec", and it actually aborts the process instead of throwing an exception. Awful! – Marcelo Gaio May 17 '23 at 19:09