I have a small headless LAMP web server running, and I also use it to download files from the internet. At the moment I have to log in via SSH and start each download with wget. Some of the files are really large (exceeding 4 GB). A nice solution would be a Python CGI that adds a link to a queue and lets Python do the rest. I already know how to download files from the net (like here: Download file via python) and I know how to write the Python CGI (or WSGI). The problem is that the download script would need to keep running after the request finishes, which would otherwise mean holding the HTTP connection open for the whole download, and that would be pretty useless. So I think I need some kind of background solution. Help or hints would be much appreciated.
Thanks in advance & best regards!