Right now the script opens a URL and checks for RSS feeds, but there is a problem. If the site hangs, it looks like it is loading but never returns any data, so my script hangs with it and never retrieves any RSS feeds. After that the only solution is to restart the Python script. Is there a way to set a timeout (or something similar) on the connection, so that if the first attempt fails it retries after 60 seconds (or less)?
import threading
import feedparser

def request_refresh(self):
    # Open the log of already-seen RSS entries (duplicates)
    FILE = open(self.request_entries, "r")
    filetext = FILE.read()
    FILE.close()
    for feed in feeds['request']:
        d = feedparser.parse(feed)
        for entry in d.entries:
            # title, url and description code goes here
            # Write the RSS entry to the log file (URL|title)
            FILE = open(self.request_entries, "a")
            FILE.write('{}\n'.format(id_request))
            FILE.close()
    # Start checking the RSS feeds again in 5 seconds
    threading.Timer(5.0, self.request_refresh).start()
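One possible approach (a sketch, not tested against your setup): `feedparser.parse` itself takes no timeout argument, but it fetches over Python's standard urllib machinery, so `socket.setdefaulttimeout()` puts an upper bound on its network reads. A small retry wrapper then gives the "try again after 60 seconds" behaviour; `fetch_with_retry` and its parameters are names invented here for illustration:

```python
import socket
import time

# Assumption: a global socket timeout also bounds feedparser's downloads,
# since it uses the standard library's networking under the hood.
socket.setdefaulttimeout(10)  # seconds

def fetch_with_retry(fetch, retry_delay=60.0, max_attempts=3):
    """Call fetch(); on a network timeout, sleep retry_delay seconds and retry.

    Returns fetch()'s result, or None if every attempt timed out.
    """
    for attempt in range(max_attempts):
        try:
            return fetch()
        except (socket.timeout, OSError):
            if attempt < max_attempts - 1:
                time.sleep(retry_delay)
    return None
```

In `request_refresh` you would then call `d = fetch_with_retry(lambda: feedparser.parse(feed))` and skip the feed when it returns `None`. One caveat: some feedparser versions swallow network errors instead of raising them and record the failure on the result's `bozo` attribute, so it is worth checking `d.bozo` as well before trusting `d.entries`.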