
I have a small headless LAMP web server running, and I also use it to download files from the internet. At the moment I have to log in via SSH and start each download with wget. Some of the files are really large (exceeding 4 GB).

A nice solution would be a Python CGI script that adds a link to a queue and lets Python do the rest. I already know how to download files (as in: Download file via python), and I know how to write the Python CGI (or WSGI) part. The problem is that the script would need to run for the whole duration of the download, which means keeping the HTTP connection alive the entire time, and that would be pretty useless. So I think I need some kind of background solution. Help or hints would be much appreciated.
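For illustration, a minimal sketch of the enqueue side: the CGI script just appends the submitted URL to a plain-text queue file and returns immediately, so no connection stays open. The queue path /var/tmp/dl-queue.txt and the url form field are assumptions, not anything from the question.

```python
#!/usr/bin/env python3
# Minimal CGI sketch: append the submitted URL to a queue file and return
# right away, so the HTTP connection never stays open for the download.
# QUEUE_FILE is an assumed path; adjust to your setup.
import cgi
import fcntl

QUEUE_FILE = "/var/tmp/dl-queue.txt"  # hypothetical queue location

def main():
    form = cgi.FieldStorage()
    url = form.getfirst("url", "")
    print("Content-Type: text/plain")
    print()
    if not url:
        print("missing 'url' parameter")
        return
    with open(QUEUE_FILE, "a") as f:
        fcntl.flock(f, fcntl.LOCK_EX)  # guard against concurrent requests
        f.write(url + "\n")
        fcntl.flock(f, fcntl.LOCK_UN)
    print("queued: " + url)

if __name__ == "__main__":
    main()
```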

Thanks in advance & best regards!

Skar
  • You could use a scheduler (cron on Linux, Task Scheduler on Windows) or implement a daemon; see the sketch below. Your question is so broad that I cannot give further advice at this stage. – M.Rau Aug 15 '18 at 08:53
  • Thanks for your quick answer. I think I will try the daemon version. Once it does its job, I will update my answer, as it may be of interest to others. – Skar Aug 15 '18 at 11:30
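Picking up the cron/daemon suggestion from the comment above, here is a minimal sketch of the worker side, assuming it is run periodically from cron (for example `*/5 * * * * /usr/local/bin/dl_worker.py`) and drains the queue file written by the CGI sketch in the question. All paths and the chunk size are assumptions; the chunked read keeps memory use flat even for files beyond 4 GB.

```python
#!/usr/bin/env python3
# Background worker sketch: drain the queue file and download each URL in
# fixed-size chunks so multi-GB files never have to fit in memory.
# All paths are assumptions; run this from cron or wrap it in a daemon loop.
import os
import urllib.request

QUEUE_FILE = "/var/tmp/dl-queue.txt"  # must match the CGI script
TARGET_DIR = "/srv/downloads"         # hypothetical download directory
CHUNK = 1024 * 1024                   # read 1 MiB at a time

def download(url):
    name = os.path.basename(url.split("?", 1)[0]) or "unnamed"
    dest = os.path.join(TARGET_DIR, name)
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        while True:
            chunk = resp.read(CHUNK)
            if not chunk:
                break
            out.write(chunk)

def main():
    if not os.path.exists(QUEUE_FILE):
        return  # nothing queued
    os.makedirs(TARGET_DIR, exist_ok=True)
    # Rename the queue first so URLs submitted while we work land in a
    # fresh file instead of the batch we are processing.
    work_file = QUEUE_FILE + ".work"
    os.rename(QUEUE_FILE, work_file)
    with open(work_file) as f:
        for line in f:
            url = line.strip()
            if url:
                download(url)
    os.remove(work_file)

if __name__ == "__main__":
    main()
```

A long-running daemon would simply wrap main() in a loop with time.sleep() instead of relying on cron; the rename-before-processing step keeps the current batch separate from URLs submitted while the worker runs.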

0 Answers