
Currently, I'm using a shell script to download the files from an FTP server. Ansible executes my script and then continues with other automated jobs.

Please let me know the best way to do this in an Ansible playbook using `get_url` instead of `shell`. The following task works for downloading a single file, but my requirement is to download multiple files and directories.

I appreciate your help.

```yaml
- name: FTP Download
  get_url:
    url: ftp://username:password@ftp.server.com/2016/03/value/myfile
    dest: /home/user/03/myfile1
  register: get_url_result
```
miki
  • It looks like `get_url` does not support recursive download. I think `wget` is the best option for my requirement of downloading the whole directory. I am using `- shell: wget -r -np -nH --cut-dirs=1 ftp://username:password@ftp.server.com/2016/03/*` – miki Mar 27 '16 at 23:49

2 Answers


According to the get_url documentation, and as far as I know, get_url does not support recursive download.

One possibility, as @helloV suggested, is to loop through a list with `with_items`. But this would require having a static list of files, or obtaining that list somehow, probably with `wget`.

Consequently, you could simply use wget -m directly in order to recursively download all files with one task. See How to recursively download a folder via FTP on Linux.
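A minimal sketch of such a task, using the question's placeholder credentials and paths (the `--cut-dirs` depth and destination directory are assumptions you would adjust to your layout):

```yaml
# Mirror the remote FTP directory tree with wget in a single Ansible task.
# -m mirrors recursively, -np avoids ascending to the parent directory,
# -nH drops the hostname from local paths, --cut-dirs trims leading path parts.
- name: Recursively download FTP directory
  command: >
    wget -m -np -nH --cut-dirs=2
    -P /home/user/03
    ftp://username:password@ftp.server.com/2016/03/
  args:
    creates: /home/user/03/value  # rough idempotence guard; skip if already mirrored
```

Note that `command`/`shell` tasks are not idempotent by themselves, which is why the `creates` argument is used as a best-effort guard here.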

Wtower

Use a list of URLs (or a dict of URL/destination pairs) for the files you need, defining a few URLs and destination files, and then loop through it using `with_items`.
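Such a loop might look like the sketch below; the file names are illustrative, based on the FTP layout in the question:

```yaml
# Download a known, static set of files from the FTP server (illustrative names).
- name: FTP Download
  get_url:
    url: "ftp://username:password@ftp.server.com/2016/03/value/{{ item.file }}"
    dest: "/home/user/03/{{ item.dest }}"
  with_items:
    - { file: myfile1, dest: myfile1 }
    - { file: myfile2, dest: myfile2 }
  register: get_url_result
```

When `register` is used inside a loop like this, the per-file results end up in `get_url_result.results` rather than directly in `get_url_result`.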

helloV
  • http://stackoverflow.com/questions/113886/how-do-you-recursively-ftp-a-folder-in-linux – helloV Mar 26 '16 at 15:21
  • Thanks for the reply. What is the best way to download a directory and its subdirectories from a single "url: ftp.server.com" ...? I am able to do it using "shell: wget -r --no-parent". Are there any options available in "get_url"? – miki Mar 26 '16 at 15:25