I have an alias for aria2 that downloads a list of files from an FTP server using an input file. This is how I had it set up.
aria2c --max-concurrent-downloads=1 --max-connection-per-server=6 --ftp-user=<user> --ftp-passwd=<password> --dir=/home/<username>/Downloads --input-file=/home/<username>/scripts/downloads.txt
I ran into an issue just now (not sure why, since it never happened before) where it wouldn't resume and instead tried to re-download the files, saving them with a .1 suffix.
So I read the man page, saw there is a --continue option, and changed it to
aria2c --max-concurrent-downloads=1 --max-connection-per-server=6 --continue=true --ftp-user=<user> --ftp-passwd=<password> --dir=/home/<username>/Downloads --input-file=/home/<username>/scripts/downloads.txt
It works now, but my only issue is that it loops through the input file and checks each download, making sure it's complete, until it finds where it left off. With only 4 of 10 files downloaded (all under 1 GB), it started at 15:51:52 and didn't find the .aria2 control file for the partial download (#5 of 10) and resume it until 16:00:16. Sometimes I'm dealing with 20+ files, or files larger than 1 GB, and I'm not sure whether that checking time also grows with file size, so the delay could potentially stretch to an hour or more. Is there any way to force it to look for an existing .aria2 file in the directory and start there immediately, or do I just have to live with it or remove finished files from the text file to avoid this?
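
For reference, the "remove finished files" workaround I'm picturing would be something like the rough sketch below, assuming downloads.txt is just one URL per line (no per-entry aria2 options) and that each file is saved under the last path component of its URL:

DL_DIR=/home/<username>/Downloads
LIST=/home/<username>/scripts/downloads.txt

# Keep an entry only if its file is missing or still has an .aria2 control file
while IFS= read -r url; do
    [ -z "$url" ] && continue
    name=$(basename "$url")
    if [ ! -f "$DL_DIR/$name" ] || [ -f "$DL_DIR/$name.aria2" ]; then
        echo "$url"
    fi
done < "$LIST" > "$LIST.tmp" && mv "$LIST.tmp" "$LIST"

I'd rather not have to run something like that before every session, though, if aria2 can just be told to skip straight to the partial download.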