
Right now the script opens a URL and checks for RSS feeds, but there is a problem. Say a site hangs: it looks like it is loading but never returns any data. My script then hangs too and stops retrieving RSS feeds, and the only fix is to restart the Python script. Is there any way to set a timeout (or something similar) on the connection, so that if the first attempt fails it tries again after 60 seconds (or less)?

    def request_refresh(self):
        # Open the file that logs RSS entries already seen (duplicates)
        FILE = open(self.request_entries, "r")
        filetext = FILE.read()
        FILE.close()
        for feed in feeds['request']:
            d = feedparser.parse(feed)
            for entry in d.entries:
                # title, URL and description handling goes here
                # Write the RSS entry to the log file (URL|title)
                FILE = open(self.request_entries, "a")
                FILE.write('{}\n'.format(id_request))
                FILE.close()
        # Start checking the RSS feeds again in 5 seconds
        threading.Timer(5.0, self.request_refresh).start()
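
For reference, one common way to avoid the hang (see also the timeout question linked in the comments below) is to set a global socket timeout before calling feedparser.parse() and to reschedule a feed that fails. This is only a sketch, not part of the original script; the 10-second timeout, 60-second retry interval, and the parse_with_retry helper name are assumptions for illustration:

    # Minimal sketch: a global socket timeout makes feedparser's underlying
    # HTTP fetch give up instead of hanging forever; a failed fetch is
    # rescheduled instead of blocking the whole refresh loop.
    import socket
    import threading

    import feedparser

    socket.setdefaulttimeout(10)  # seconds; assumed value, tune as needed

    def parse_with_retry(url, retry_after=60.0):
        d = feedparser.parse(url)
        # feedparser does not raise on network errors; it flags them via
        # the "bozo" bit and leaves the entries list empty.
        if d.bozo and not d.entries:
            threading.Timer(retry_after, parse_with_retry, args=[url]).start()
            return None
        return d

Note that socket.setdefaulttimeout() affects every socket created afterwards in the process, which is usually acceptable for a single-purpose script like this; the per-feed retry keeps the other feeds refreshing while the slow one is tried again.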
ZeroSuf3r
  • Does this answer your question? [feedparser with timeout](https://stackoverflow.com/questions/9772691/feedparser-with-timeout) – Craig Dec 21 '22 at 16:36

0 Answers