Here are some ways to collect the output:
- If the data is pretty small and well formatted, like just one line per URL, you can simply copy the output from the console prints.
- If the data is very big (I assume this is your situation), you can write the output to files, for example:
import requests

url = 'url goes here'
r = requests.get(url)
print(r.text)

# write the response text to a file
with open('/path/to/file.txt', 'w', encoding='utf-8') as f:
    f.write(r.text)  # r.text the variable, not the string 'r.text'
- If you have thousands of URLs and need to write thousands of files, just add a for loop over the URLs and write each output to a different file, as sketched below.
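A minimal sketch of that loop; the URL list and the file-name pattern are placeholders, so adjust them to your case:

import requests

urls = [
    'https://example.com/page1',
    'https://example.com/page2',
    # ... your thousands of URLs here
]

for i, url in enumerate(urls):
    r = requests.get(url)
    # one file per URL: output_0.txt, output_1.txt, ...
    with open(f'output_{i}.txt', 'w', encoding='utf-8') as f:
        f.write(r.text)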
The examples above use a .txt file, but you can also write the output to a .xml or .html file, or any format that is more convenient for you to reuse, such as docx, Excel, csv, or json.
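For instance, here is a rough sketch that collects all responses into a single JSON file using the standard-library json module (the URL list is again a placeholder):

import json
import requests

urls = ['https://example.com/page1', 'https://example.com/page2']

# map each URL to its response text
data = {url: requests.get(url).text for url in urls}

with open('output.json', 'w', encoding='utf-8') as f:
    json.dump(data, f, ensure_ascii=False, indent=2)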