
I'm trying to get a listing of all files within all directories on our SAN. I'm starting with my local machine to test out how I want to do it. So, in my Documents directory: `ls -sR > documents_tree.txt`

On just my local machine, that's fine; it gives exactly the output I want. But since I'm doing it on our SAN, I'm going to have to compress on the fly, and I'm not sure of the best way to do this. So far I have: `ls -sR > documents_tree.txt | tar -cvzf documents_tree.tgz documents_tree.txt`

When I try to check the output, I can't untar the file using `tar -xvf documents_tree.tar` after I have gunzipped it.

So, what is the correct way to compress on-the-fly? How can I accurately check my work? Will this work when performing the same process on a SAN?

tbw875
  • You're not tarring the files, you're just tarring the list of filenames. Is that what you really want? – Barmar Mar 20 '17 at 17:59
  • What exactly do you want to do? The snippet you show writes the output of `ls -sR` to a file and simultaneously starts a `tar` process that writes this file to a gzipped tar archive. You have a race between `ls` and `tar` on whether the former finishes before the latter starts reading the file. I'm not sure what the interplay between the redirection and pipeline is (they both act on stdout of `ls`). If you just want to archive a directory, wouldn't `tar czvf directory.tgz directory/` do the trick? – Roland W Mar 20 '17 at 18:00
  • There's nothing special about doing this on a SAN. The SAN just looks like ordinary files to applications. – Barmar Mar 20 '17 at 18:01
  • By the way: never use the output of `ls` for anything but human reading. Use `find` instead, or `find -print0` if you may have strange filenames (see the sketch after these comments). – Roland W Mar 20 '17 at 18:02
  • See the accepted answer on this question [find-files-and-tar-them](http://stackoverflow.com/questions/5891866/find-files-and-tar-them-with-spaces) – mattmc Mar 20 '17 at 18:05
  • @Barmar @RolandW Thanks. I don't need to tar the actual files, just a list of the files and directories. The output I see from `ls -sR` is exactly what I want. The point of the rest of it is that the SAN is huge, so even if it's just a list of all of it, I'll want to compress the output of the `ls`. – tbw875 Mar 20 '17 at 18:06
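
Building on the comments above about preferring `find` over parsing `ls` output, here is a minimal sketch of that approach with on-the-fly compression. It assumes GNU `find` (for `-printf`) and uses `~/Documents` purely as an example path; the output is one "size path" line per file rather than the grouped `ls -sR` layout:

# print each file's size (in 1 KiB blocks) and path, compressing the listing as it is produced
find ~/Documents -type f -printf '%k %p\n' | gzip > documents_tree.txt.gz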

2 Answers


You don't need to use `tar` to compress a single file; just use `gzip`:

ls -sR | gzip > documents_tree.txt.gz

You can then use `gunzip documents_tree.txt.gz` to uncompress it, or tools like `gzcat` and `zless` to view it without having to uncompress it first.
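
A quick sketch of checking the result, using the file name above (`gzcat` is the BSD/macOS spelling, `zcat` the GNU one, and `-k` needs a reasonably recent `gzip`):

zless documents_tree.txt.gz          # page through the compressed listing
zcat documents_tree.txt.gz | head    # peek at the first few lines without uncompressing
gunzip -k documents_tree.txt.gz      # uncompress while keeping the .gz file around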

Barmar
  • Thanks, this worked. I wish `ls` had a verbose mode... I don't know if it's actually doing anything, or if this SAN is actually that huge. – tbw875 Mar 20 '17 at 19:57
  • You could use `ls -sR | tee /dev/tty | gzip > documents_tree.txt.gz`. Then you'll see the `ls` output in parallel with it being written to the gzip file. – Barmar Mar 20 '17 at 19:58

Building upon your comment on the OP and using your initial command, the following works for me:

ls -sR > documents_tree.txt && tar -cvzf documents_tree.tgz documents_tree.txt
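
To check the archive, note that `tar` can read gzipped archives directly (the `z` flag handles the decompression), so there is no need for the separate gunzip step the question describes:

tar -tzvf documents_tree.tgz    # list the archive's contents without extracting
tar -xzvf documents_tree.tgz    # extract documents_tree.txt in one step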

mattmc