There isn't much you can do on the Python side.
If parsing speed is a problem for you, consider using the latest simplejson, which is significantly faster at loading than the standard library json. Keep in mind that even though deserialization is faster when comparing the libraries directly, the difference may not be noticeable over your whole request/response cycle.
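As a minimal sketch, simplejson exposes the same loads/dumps API as the standard library, so you can import it with a stdlib fallback and keep the rest of your code unchanged (the sample payload here is made up for illustration):

```python
# simplejson mirrors the stdlib json API, so a fallback import
# lets the code run whether or not simplejson is installed.
try:
    import simplejson as json
except ImportError:
    import json

payload = '{"lat": 52.5487, "lon": -1.8160, "zoom": 18}'
data = json.loads(payload)
print(data["lat"])
```

Nothing else in the calling code has to change, which makes it easy to benchmark both libraries against your real payloads before committing.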
For running parallel requests, you should try grequests:
import grequests

urls = [
    "http://nominatim.openstreetmap.org/reverse?format=json&lat=52.5487429714954&lon=-1.81602098644987&zoom=18&addressdetails=1",
    ....
]
reqs = (grequests.get(u) for u in urls)
responses = grequests.map(reqs)
for r in responses:
    print(r.json())
Obviously, even if you start 50 requests in parallel, you're still bound by your network bandwidth and the remote server's performance.