I want to run searches for the URLs in a list, which adds up to approximately 74 URLs. I used try/except to tell Python to skip sites that do not respond (HTTP error code 400, etc.):
from googleapiclient.discovery import build

service = build("customsearch", "v1", developerKey='keyhere')
for i in range(0, k - 1):
    try:
        queries = search_sources[i]
        res = service.cse().list(q=queries, cx='idhere').execute()
    except urllib2.HTTPError:
        continue
The problem is that some sites still raise errors, which means the except clause is not catching them:
Traceback (most recent call last):
File "<stdin>", line 4, in <module>
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/oauth2client/_helpers.py", line 133, in positional_wrapper
return wrapped(*args, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/googleapiclient/http.py", line 838, in execute
raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://www.googleapis.com/customsearch/v1?q=site%3Ahttp%3A%2F%2Fbit.ly%2FgktvnmChina+protest&alt=json&cx=myid&key=mykey returned "Bad Request">
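Note that the traceback shows the raised exception is googleapiclient.errors.HttpError, not urllib2.HTTPError. The skip-on-error pattern itself works when the except clause names the class that is actually raised; here is a minimal self-contained sketch of that pattern (HttpError, fake_search, and search_sources below are hypothetical stand-ins, not the real googleapiclient API):

```python
class HttpError(Exception):
    """Stand-in for googleapiclient.errors.HttpError."""

def fake_search(query):
    """Stand-in for service.cse().list(...).execute(); fails on some queries."""
    if "bad" in query:
        raise HttpError("400 Bad Request")
    return {"query": query}

search_sources = ["good query", "bad query", "another good query"]

results = []
for query in search_sources:
    try:
        results.append(fake_search(query))
    except HttpError:
        # Only exceptions of exactly this class (or its subclasses)
        # are caught; anything else still propagates.
        continue

print(len(results))  # 2: the "bad query" entry was skipped
```

If the except clause instead named an unrelated class (as with urllib2.HTTPError above), the error would propagate and stop the loop.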