I would like to send jobs (files to download) to aria2c via a stdin pipe:
aria2c -i -
aria2c failed to start downloads immediately when a new job was fed every few seconds. For 10+ short jobs it started downloading only after the input pipe had been closed. Can this be fixed via command-line options?
Sample shell script for testing (the real script downloads over 50 URLs):
#!/bin/sh
# Emit one URI line per second, plus an indented " out=NAME" option line
# when an output name is given (aria2 input-file format).
while read -r URL OUT; do
    echo "$URL"
    [ -n "$OUT" ] && echo " out=$OUT"
    sleep 1
done <<END | aria2c --deferred-input=true -i -
http://example.com/
http://example.net/
http://example.org/
END
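For comparison, here is a workaround sketch I am considering: run aria2c as a JSON-RPC server and submit each URL with the aria2.addUri method, which starts the download immediately instead of waiting for EOF on stdin. Port 6800 is aria2's default RPC port; the secret token value and the request id are placeholders of my own. The curl call is commented out here so the script only shows the payloads it would send.

```shell
#!/bin/sh
# Assumes an aria2c RPC server was started separately, e.g.:
#   aria2c --enable-rpc --rpc-listen-all=false --rpc-secret=s3cret

rpc_payload() {
    # Build an aria2.addUri JSON-RPC request for a single URL.
    # "s3cret" and "qid" are placeholder values.
    printf '{"jsonrpc":"2.0","id":"qid","method":"aria2.addUri","params":["token:s3cret",["%s"]]}' "$1"
}

while read -r URL OUT; do
    # Real submission would be:
    # curl -s http://127.0.0.1:6800/jsonrpc -d "$(rpc_payload "$URL")"
    rpc_payload "$URL"; echo
done <<END
http://example.com/
http://example.net/
END
```

With this approach each job is handed over in its own request, so buffering on the pipe no longer matters.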
-i - and "not all at once" pipe - Issue #1161
P.S. What I would really like is a "download server" that is not accessible to other hosts or to other local users. Something like RPC over Unix sockets (access control via file permissions) would be acceptable.
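Until something like that exists, a stopgap I am considering (an untested sketch; the socket path and token are placeholders of my own): bind the RPC port to loopback with a secret token, then expose it on a Unix socket via socat so that file permissions gate access.

```shell
# Restrict RPC to loopback and require a secret token
# (token value is a placeholder):
aria2c --enable-rpc --rpc-listen-all=false --rpc-secret=s3cret &

# Bridge a permission-controlled Unix socket to the loopback RPC port;
# mode=600 keeps the socket private to the owning user
# (the socket path is arbitrary):
socat UNIX-LISTEN:/tmp/aria2.sock,fork,mode=600 TCP:127.0.0.1:6800
```

This still leaves the TCP port reachable by other local users, so it only narrows the problem rather than solving it; native Unix-socket RPC would be cleaner.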