40

How can I download a file with wget using multiple connections, where each connection downloads a part of the file?

user2789031
  • See also https://stackoverflow.com/questions/3430810/wget-download-with-multiple-simultaneous-connections – Nemo Jun 21 '17 at 20:52
  • Does this answer your question? [Multiple simultaneous downloads using Wget?](https://stackoverflow.com/questions/3430810/multiple-simultaneous-downloads-using-wget) – OverShifted Jun 30 '21 at 17:36

3 Answers

45

Use aria2:

    aria2c -x 16 [url]   # 16 is the number of connections
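
`-x` only caps the number of connections to one server; if you also want the file split into that many pieces, aria2's `-s` (split) and `-k` (minimum split size) options can be combined with it. A sketch with a placeholder URL (check `man aria2c` for the exact semantics):

    # split the file into 16 pieces, open up to 16 connections to the server,
    # and keep each piece at least 1 MiB
    aria2c -x 16 -s 16 -k 1M https://example.com/big-file.iso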

OR

Just repeat `wget -r -np -N [url]` for as many threads as you need. This isn't pretty and there are surely better ways to do it, but if you want something quick and dirty it should do the trick.
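
A rough sketch of that quick-and-dirty approach (the process count and `[url]` are placeholders; as the comments below note, this parallelizes a recursive mirror rather than splitting a single file into parts):

    # launch several recursive wget processes in the background;
    # -N lets each process skip files that are already up to date locally
    for i in 1 2 3 4; do
        wget -r -np -N [url] &
    done
    wait   # block until every background wget has finished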

Jayesh Bhoi
  • The solution you gave for wget downloads a single file multiple times, resulting in multiple copies, but I want to download a single file using multiple connections, each of which downloads its part of the file. – user2789031 Mar 01 '14 at 13:18
  • The `wget` solution doesn't download one file in multiple threads. The options used are: `-r` (recursive); `-np` (`--no-parent`): don't ascend to the parent directory; `-N` (`--timestamping`): don't re-retrieve files unless newer than the local copy. But `wget` would definitely work if you're downloading a mirror of a site. – Alexey Ivanov Apr 07 '15 at 14:44
  • @Trix `-x, --max-connection-per-server=NUM`: the maximum number of connections to one server for each download. Possible values: 1-16. Default: 1. Tags: #basic, #http, #ftp – mlapaglia Feb 09 '16 at 14:38
  • Perhaps the flag `-nc` was intended, not `-np` – awiebe Dec 06 '17 at 11:11
  • The link in the answer is 404 – M.M Jan 04 '21 at 09:53
  • On Ubuntu you can install aria2 with `sudo apt-get install aria2` – collimarco Jan 12 '23 at 16:20
30
    sudo apt-get install axel
    axel -n 5 url

That will do the job!

Axel is a lightweight program that helps speed up downloads and supports multiple connections.

Want to resume a download after a disconnection? No problem, just repeat the command and axel will take care of it.
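
For example (the URL is a placeholder; axel tracks the incomplete transfer in a state file next to the partial download, which is what makes the resume possible):

    axel -n 5 https://example.com/big-file.iso
    # ...connection drops mid-transfer...
    axel -n 5 https://example.com/big-file.iso   # re-running the same command resumes the partial download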

Ash
  • Not very useful with long S3 pre-signed URLs due to `Can't handle URLs of length over 1024` limitation – Matt Kucia Sep 29 '21 at 11:40
  • I have the same issue with S3 pre-signed URLs. This command is totally useless and it doesn't make sense to have a 1024 character limit on a URL in 2023... – collimarco Jan 12 '23 at 15:45
-1

Axel supports a visual progress indicator:

    axel -a -n [X parts] [URL]
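
For instance (placeholder URL; `-a` selects axel's alternate, single-line progress display):

    axel -a -n 8 https://example.com/big-file.iso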

  • What is the additional insight you contribute on top of https://stackoverflow.com/a/35307402/7733418 ? – Yunnosch Sep 12 '22 at 10:04