This code generates about 13,000 URLs and then checks whether each one returns a 200 response. I have to check every URL because there is no other way to know whether it exists. If it does, the script downloads an image, skipping any image that already exists in the folder. It works, but it takes hours to finish.
I'm using requests, shutil, os, and tqdm. I'm new to Python; while researching I found asyncio and aiohttp and watched a couple of tutorials, but I didn't manage to make them work.
def downloader(urls):
    for url in tqdm(urls):
        r = requests.get(url, headers=headers, stream=True)
        if r.status_code == 200:
            name, path = get_name_path(url)
            if not check_dupe(name):
                save_file(path, r)

folder_path = create_dir()
urls = generate_links()
downloader(urls)
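The closest I've gotten is a sketch like the one below, using aiohttp with a semaphore to bound concurrency. Note the assumptions: `HEADERS` is a stand-in for my `headers` dict, and the name/duplicate/save logic (`get_name_path`, `check_dupe`, `save_file`) is replaced by inline equivalents based on the URL tail, which may not match my real helpers:

```python
import asyncio
import os

import aiohttp

# Hypothetical headers; substitute the real `headers` dict.
HEADERS = {"User-Agent": "Mozilla/5.0"}

# Cap on simultaneous requests so the server isn't hammered.
MAX_CONCURRENT = 20

async def fetch_one(session, sem, url, folder):
    async with sem:
        async with session.get(url, headers=HEADERS) as resp:
            if resp.status != 200:
                return False
            # Derive a file name from the URL tail (stand-in for get_name_path).
            name = url.rsplit("/", 1)[-1]
            path = os.path.join(folder, name)
            if os.path.exists(path):  # stand-in for check_dupe
                return False
            data = await resp.read()
            with open(path, "wb") as f:  # stand-in for save_file
                f.write(data)
            return True

async def download_all(urls, folder):
    sem = asyncio.Semaphore(MAX_CONCURRENT)
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(
            *(fetch_one(session, sem, u, folder) for u in urls)
        )
    return sum(results)  # number of images actually saved

# Intended usage:
# saved = asyncio.run(download_all(urls, folder_path))
```

Is this roughly the right structure, or am I misusing the session/semaphore here?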