347

I'm trying to ftp a folder using the command line ftp client, but so far I've only been able to use 'get' to get individual files.

dotancohen
Charles Ma
  • The right answer is from Apr 6 '11 at 14:13 by lkuty: don't use `mget`, it's not recursive at all. The answer from Sep 22 '08 at 9:01 by Thibaut Barrère is easier to understand, but you must add the option `-l 0`, as mentioned in the comments. – chriscatfr Nov 11 '12 at 22:12

12 Answers

677

You could rely on wget, which usually handles recursive FTP retrieval properly (at least in my own experience). For example:

wget -r ftp://user:pass@server.com/

You can also use -m which is suitable for mirroring. It is currently equivalent to -r -N -l inf.

If there are special characters in the credentials, you can specify the --user and --password arguments to get it to work. Example with a login containing special characters (note the single quotes, which stop the shell from expanding `$$`):

wget -r --user='user@login' --password='Pa$$wo|^D' ftp://server.com/

As pointed out by @asmaier, note that even though -r enables recursion, it has a default maximum depth of 5:

-r
--recursive
    Turn on recursive retrieving.

-l depth
--level=depth
    Specify recursion maximum depth level depth.  The default maximum depth is 5.

If you don't want to miss any subdirectories below that depth, better use the mirroring option, -m:

-m
--mirror
    Turn on options suitable for mirroring.  This option turns on recursion and time-stamping, sets infinite
    recursion depth and keeps FTP directory listings.  It is currently equivalent to -r -N -l inf
    --no-remove-listing.
Neuron
Thibaut Barrère

  • Better use `wget -m` (`--mirror`). `wget -r` is limited to a recursion depth of 5 by default. – asmaier Jul 07 '11 at 19:28
  • I had to use `--user` and `--password` too on Red Hat. My wget is `GNU Wget 1.11.4 Red Hat modified`; I wonder if it's a version thing or a distro thing... – devin Jan 17 '12 at 20:18
  • You can set an infinite recursion level with `-l 0`, so there is no need to use `--mirror`, which may have some unwanted side effects such as .listing files. – Hnatt Mar 08 '12 at 21:52
  • You can enter the command via a shell script if you would prefer not to have your password in the history. Remember to delete the script when you're done. Alternatively, you can run `history -c`. – SSH This Dec 10 '12 at 21:27
  • I use `wget --ask-password -rl 99 ftp://user@server.com`. This way the password is not visible with `ps` and does not remain in the history. Of course, by the nature of FTP it is sent unencrypted to the server. – Walter Tross Oct 25 '13 at 22:25
  • The `-b` parameter is also very useful in this usage. – Huseyin Yagli Aug 11 '14 at 20:13
  • Reminder for bash newbies: you'll have to use single quotes if your username or password contains shell-special characters (like `$`), e.g. `--user='user' --password='pa$$word'`. – tobek Sep 15 '15 at 00:48
  • If you can't use wget, use my ftp script: https://github.com/thomasX/ftpBackup – user1484745 Aug 22 '17 at 15:57
181

Just to complement the answer given by Thibaut Barrère.

I used

wget -r -nH --cut-dirs=5 -nc ftp://user:pass@server//absolute/path/to/directory

Note the double slash after the server name. Without the extra slash, the path is relative to the home directory of user.

  • -nH avoids the creation of a directory named after the server
  • -nc avoids creating a new file if it already exists at the destination (the file is simply skipped)
  • --cut-dirs=5 takes the content of /absolute/path/to/directory and puts it in the directory where you launch wget. The number 5 strips the first 5 components of the remote path; the double slash counts as an extra component.
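To see what --cut-dirs does, here is a rough sketch of the path arithmetic (this only mimics wget's behavior; the path is the example from above):

```shell
# The remote path as wget sees it after the host name. The leading
# slash produces an empty first field, so dropping 5 components
# removes that empty component plus the 4 directory names:
remote="/absolute/path/to/directory/file.txt"
echo "$remote" | cut -d/ -f6-
```

This prints just `file.txt`, which is where the file lands relative to the download directory.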
Neuron
Ludovic Kuty

  • Brilliant. The ability to skip files that already exist is great for catching up with the latest additions during a server migration. rsync is more efficient and more flexible, but sometimes that option is just not available and only FTP can be used. – Jason Oct 23 '12 at 19:50
  • I don't always trust "skip existing files", because either copy might be incomplete or differ in size and contents, but it's good that he mentioned the option. – Daniel W. Nov 14 '13 at 08:48
  • Even today, I still use this wget command when unable to use rsync. The -nc and --cut-dirs options are so useful! – Rob W Jan 27 '15 at 19:11
  • Awesome! And if you don't want to put your password on the command line, you can use `--ftp-user=USER` and `--ask-password`. – shoover Jun 05 '15 at 23:03
  • It doesn't create the directory structure at all, so if you have files with identical names in different directories, they will be skipped... – Quentin Oct 13 '16 at 18:22
25
ncftp -u <user> -p <pass> <server>
ncftp> mget directory
Vinko Vrsalovic
  • I couldn't log in to an FTP server using those parameters, but using the form `open ftp://USERNAME:PASSWORD@HOST` after starting ncftp worked… – feeela Oct 07 '11 at 10:22
  • Definitely more reliable than `wget`, and faster too in TAR mode. Thanks! – lencinhaus Nov 22 '13 at 11:04
  • This doesn't work as stated on Ubuntu 14.04. The syntax that worked for me was `get -R directory` instead of mget. – Ivan Sep 03 '14 at 16:39
  • Worked for me on Ubuntu 14.04. I didn't have to mget; it was all just there. – ashley Dec 08 '15 at 11:24
23

If lftp is installed on your machine, use `mirror dir` and you are done: the mirror command downloads the whole directory tree recursively.
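A typical non-interactive invocation looks like this (host, credentials, and paths are placeholders, not values from the question):

```shell
# Log in, mirror the remote directory into a local one, then quit:
lftp -u user,password -e 'mirror /remote/dir local_dir; quit' ftp.example.com
```

`mirror -R` goes the other way and uploads a local tree to the server.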

Dilawar
17

If you can use scp instead of ftp, the -r option will do this for you. I would check to see whether you can use a more modern file transfer mechanism than FTP.
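For instance (server name and paths here are placeholders):

```shell
# Recursively copy a remote directory over SSH:
scp -r user@server.example.com:/remote/dir ./local_dir
```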

Greg Hewgill
  • I voted this up because it was exactly my first thought, even though it doesn't strictly answer the question as-is. – metao Sep 22 '08 at 09:04
  • Which one do you propose, @greg-hewgill? I have a similar problem. – Michal Gonda Aug 25 '16 at 11:25
  • `scp` will still attempt to use ssh, which won't work if you only have FTP credentials. Feels weird responding to a comment from 2008 saying the technology I'm stuck with isn't modern :( Good old 2008, I turned 18 on the day you posted your comment. – ᴍᴇʜᴏᴠ Jun 17 '17 at 19:34
12

Use wget instead. It supports both the HTTP and FTP protocols.

wget -r ftp://mydomain.com/mystuff

Good Luck!

reference: http://linux.about.com/od/commands/l/blcmdl1_wget.htm

Jason Stevenson
7

There is ncftp, which is available for installation on Linux. It works over the FTP protocol and can be used to download files and folders recursively. It has been used and works fine for recursive folder/file transfers.

Check this link... http://www.ncftp.com/

Cypher
3

If you can, I strongly suggest you tar and bzip (or gzip, whatever floats your boat) the directory on the remote machine; for a directory of any significant size, the bandwidth savings will probably be worth the time to zip/unzip.
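A minimal sketch of the round trip, with the FTP transfer step elided and all names made up for illustration:

```shell
# Create a sample directory to stand in for the remote one:
mkdir -p directory && echo "example" > directory/file.txt

# On the remote machine: pack the directory before transferring
tar cjf directory.tar.bz2 directory/

# ...fetch directory.tar.bz2 with a single plain ftp 'get'...

# On the local machine: unpack after the transfer
mkdir -p unpacked && tar xjf directory.tar.bz2 -C unpacked
ls unpacked/directory
```

One archive also means one TCP transfer instead of one per file, which matters over high-latency links.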

Hank Gay
  • Maybe in 2008, but in 2013 bandwidth doesn't matter anymore and you might have FTP but not console access :-) – Daniel W. Nov 14 '13 at 08:49
3

If you want to stick to command-line FTP, you should try NcFTP. Then you can use `get -R` to recursively get a folder. You will also get tab completion.

Jazz
3

wget -r ftp://url

Works perfectly on Red Hat and Ubuntu.

Phillip
2

You should not use ftp. Like telnet, it does not use a secure protocol, and passwords are transmitted in clear text. This makes it very easy for third parties to capture your username and password.

To copy remote directories, these options are better:

  • rsync is the best-suited tool if you can log in via ssh, because it copies only the differences and can easily resume in the middle if the connection breaks.

  • scp -r is the second-best option to recursively copy directory structures.
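For the rsync case, a typical invocation looks like this (host and paths are placeholders):

```shell
# Recursively sync a remote directory over SSH; --partial keeps
# partially transferred files so an interrupted run can resume:
rsync -avz --partial user@server.example.com:/remote/dir/ ./local_dir/
```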

To fetch files recursively, you can use a script like this: https://gist.github.com/flibbertigibbet/8165881


Tilo
  • Not secure, just ftp – Jos Sep 25 '15 at 16:34
  • It is 2015. FTP should not be used. – Tilo Sep 29 '15 at 03:22
  • I agree, and suggestions for better security should always be given. But the question was about FTP, so saying one should not use it is not helping. – Jos Oct 08 '15 at 08:43
  • I respectfully disagree. They are using the wrong tool for the job. They should learn to use secure and current tools rather than 1980s ftp. More specifically, nobody should run an ftp server anymore :P – Tilo Oct 09 '15 at 05:14
-6

Toggle off interactive prompting with the `prompt` command, so that `mget` does not ask for confirmation on every file.

Usage:

ftp> cd /to/directory
ftp> prompt
ftp> mget *
abatishchev
Rohit