I have a set of 50 URLs, and from each URL I am retrieving some data using urllib2. The procedure I am following (including setting a cookie for each request) goes as follows:
import json
import urllib2

urls = ['https://someurl', 'https://someurl', ...]
vals = []
for url in urls:
    # build one request per URL, attaching the same cookie each time
    req2 = urllib2.Request(url)
    req2.add_header('cookie', cookie)
    response = urllib2.urlopen(req2)
    data = response.read()
    vals.append(json.loads(data))
So, basically, I am retrieving data from all these URLs and appending it to the vals list. This entire procedure for 50 URLs takes around 15.5 to 20 seconds. Is there another Python library through which I can do the same operation faster? Or, if you can suggest a faster way of approaching this with urllib2 itself, that would be fine as well. Thanks.
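To illustrate what I mean by "a faster way": one direction I was wondering about is issuing the same requests concurrently from a thread pool instead of one after another. This is just an untested sketch of that idea (the pool size of 10 and the fetch helper are my own placeholders, not part of my current code):

import json
import urllib2
from multiprocessing.dummy import Pool  # thread-based pool with the multiprocessing.Pool API

def fetch(url):
    # same per-URL work as in the sequential loop above
    req = urllib2.Request(url)
    req.add_header('cookie', cookie)
    response = urllib2.urlopen(req)
    return json.loads(response.read())

pool = Pool(10)               # 10 worker threads; the right number is a guess
vals = pool.map(fetch, urls)  # results come back in the same order as urls
pool.close()
pool.join()

Since the work is network-bound, threads should overlap the waiting on the 50 responses, but I don't know if this is the idiomatic approach or whether a different library would be better.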