
Hi, I have a few files on a domain, say

example.com/text1.txt
example.com/text2.txt 
example.com/text3.txt
example.com/text50.txt

up to 50 links.

I'm looking for a script to read all these links and save their contents to local disk, or into a single text file.

Thanks in advance

Ashkar

1 Answer


Thanks for all your help. I finally found a solution using Python; below is the script:

    import urllib2  # the lib that handles the url stuff (Python 2)

    output = open("testfile.txt", "a")  # open once, outside the loop
    for num in range(100001, 150000):
        url = "http://example.com/%s.txt" % num
        data = urllib2.urlopen(url)  # file-like object; iterable line by line
        for line in data:
            output.write(line)
        output.write("\n")
    output.close()

This appends the contents of every downloaded text file to a single local file named testfile.txt.
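For anyone on Python 3, where urllib2 no longer exists, here is a minimal sketch of the same idea using urllib.request. The host, the text1.txt–text50.txt filename pattern, and the helper names are placeholders taken from the question, not a real site:

```python
import urllib.request  # Python 3 replacement for urllib2


def make_urls(base, start, stop):
    """Build the list of text-file URLs, e.g. text1.txt .. text50.txt."""
    return ["%s/text%d.txt" % (base, n) for n in range(start, stop + 1)]


def download_all(urls, out_path):
    """Append the raw contents of each URL to a single local file."""
    with open(out_path, "ab") as out:
        for url in urls:
            with urllib.request.urlopen(url) as resp:
                out.write(resp.read())
            out.write(b"\n")  # separate one file's content from the next


urls = make_urls("http://example.com", 1, 50)
# download_all(urls, "testfile.txt")  # uncomment to actually fetch
```

Opening the output file once (instead of once per URL, as in the original Python 2 snippet) avoids leaking file handles when the list of links grows.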
