638

How can I spit out a flat list of recursive one-per-line paths?

For example, I just want a flat listing of files with their full paths:

/home/dreftymac/.
/home/dreftymac/foo.txt
/home/dreftymac/bar.txt
/home/dreftymac/stackoverflow
/home/dreftymac/stackoverflow/alpha.txt
/home/dreftymac/stackoverflow/bravo.txt
/home/dreftymac/stackoverflow/charlie.txt

ls -a1 almost does what I need, but I do not want path fragments, I want full paths.

dreftymac

  • **See also:** [tree](https://stackoverflow.com/a/3455675/42223) – dreftymac Feb 19 '19 at 12:21
  • `tree -aflix --noreport`, but if you use `tree` and there are any symbolic links in the path, you will have to deal with those or use an alternate solution from one of the suggested answers. – Vector Jul 29 '20 at 15:37

26 Answers

734

Use find:

find .
find /home/dreftymac

If you want files only (omit directories, devices, etc):

find . -type f
find /home/dreftymac -type f
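
If you also want `ls -l`-style details alongside the full path (as asked in a comment below), GNU find's `-printf` can emit them directly. A sketch (GNU find only; the exact format specifiers are a suggestion you can adjust):

```shell
# GNU find only: size, owner, modification time, then the full path
find . -type f -printf '%s\t%u\t%TY-%Tm-%Td %TH:%TM\t%p\n'
```

On non-GNU systems, `find . -type f -exec ls -l {} +` gets you something similar, at the cost of extra processes.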
Jonathan Leffler
stefanB
  • Can `ls` parameters like `--sort=extension` be "redeemed" by this solution? – n611x007 Sep 28 '12 at 19:19
  • You can even use printf output in order to display needed contextual info (e.g. `find . -type f -printf '%p %u\n'`) – xsubira Apr 14 '15 at 08:42
  • Can this be formatted with a flag? i.e. like python `pprint.pprint(files)` – Nathan majicvr.com Apr 13 '18 at 19:24
  • But `find` does not print file sizes :( I want `ls -alR`, but I want each line to contain the full path. – Shayan Aug 19 '20 at 08:08
  • @Shayan `find` with the `-printf` predicate allows you to do everything `ls` does, and then some. However, it is not standard. You can use `find -exec stat {} \;`, but unfortunately the options to `stat` are not standardized, either. – tripleee Jul 15 '21 at 07:48
  • ... In the end, unfortunately, the most portable solution might be a Perl or Python script. – tripleee Jul 15 '21 at 08:07
  • Thanks for this. I'll use `find \`pwd\`` to list everything from the current location, with full absolute paths. – Myobis Nov 30 '22 at 10:40
  • `find . -type f -printf '%BF_%BH:%BM %p\n'` gives a nice output with the date first, like `2019-03-18_10:15 /path/to/file` (GNU find has printf; not all do) – MERM Feb 08 '23 at 21:24
437

If you really want to use ls, then format its output using awk:

ls -R /path | awk '
/:$/&&f{s=$0;f=0}
/:$/&&!f{sub(/:$/,"");s=$0;f=1;next}
NF&&f{ print s"/"$0 }'
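
Several commenters below ask how the awk script works; here is the same program with each rule commented (logic unchanged):

```shell
ls -R /path | awk '
  # a "dir:" header seen while the flag is set: note it and fall through
  # to the next rule, which cleans it up
  /:$/ && f  { s = $0; f = 0 }
  # a "dir:" header starting a new block: strip the trailing colon, save
  # the directory as the prefix, set the flag, and skip to the next line
  /:$/ && !f { sub(/:$/, ""); s = $0; f = 1; next }
  # any non-blank line (NF > 0) while the flag is set is a file name:
  # print it prefixed with the saved directory
  NF && f    { print s "/" $0 }
'
```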
approxiblue
ghostdog74

  • <3 this command... perfect solution for what I need right now! Thank you! – pcantalupo Aug 25 '14 at 11:46
  • Can someone please explain the above `awk` expressions? – Mert Nuhoglu Oct 10 '14 at 17:08
  • A bit late too. Indeed: how would symbolic links be handled in these cases? I was thinking about this for a "home-made" index of files, somewhat like locate on a single file list. But if I could get the full path, as these are indeed my files, it makes sense (to me) to show me the actual ls output: mainly, I guess, dates and the full path. – mariotti Feb 14 '15 at 18:08
  • The target would be: take the 12 or more old HDs I have (full HDs), make a full-path `ls` output, and use it for some fast `grep` searching. It makes little sense to back them up, as it is loads of probably already-duplicated data. Maybe there is some catalog-like tool, but I am not sure if a simple grep for when I need it is better. I mean: a full backup system vs. a couple of files... – mariotti Feb 14 '15 at 18:16
  • This solution doesn't omit the directories (each directory gets its own line) – JayB Sep 27 '16 at 18:40
  • Use it carefully! As said above, it also lists all directories, including the root directories, so if you plan to use it to delete a list of files you need to clean this list first! – WonderLand Mar 16 '17 at 02:51
  • Thanks. Btw, it would be nice to have this on one line for quick copy & paste. – Leo Ufimtsev Mar 09 '18 at 16:29
  • Add this to your [.bashrc](https://askubuntu.com/questions/540683/what-is-a-bashrc-file-and-what-does-it-do) file: `function lsr () { ls -R "$@" | awk ' /:$/&&f{s=$0;f=0} /:$/&&!f{sub(/:$/,"");s=$0;f=1;next} NF&&f{ print s"/"$0 }' }` so you can use `lsr /path` wherever you like. – jessexknight Jul 25 '18 at 14:05
  • The other answer should be accepted. This is a good second answer and does indeed answer the question as written, but the best solution is the one using `find`. The intention of the question is clear (it does not have to be done using `ls`), and this answer is better only for those who can't use `find`. The claim that this answer is more "right" because it answers the *written* question instead of the *intended* one is pretty ridiculous. – Jasper Jun 18 '19 at 09:39
  • The correct answer is that you can't. This is just a waste of time. – TZubiri Apr 17 '20 at 04:50
  • Do you mind explaining the `awk` code? It looks like you are using a regex to catch lines that end in ":" (the "headers" with parent directory paths), but I get lost after that and definitely don't understand the part where the last field `NF` is being evaluated as true/false. Thanks! – Josh Sep 11 '20 at 23:17
  • The NF eval is for checking whether the line is blank. It could be replaced with /^$/ for example, but NF is computed anyway. I simplified the awk code as follows, as the provided version is buggy in that it omits files in the top-level directory (I edited the answer, but I'm not sure if my edits will be approved): `ls -R | awk '/:$/ {sub(/:$/,""); s=$0;next}{ print s "/" $0 }'` – taltman Jan 09 '21 at 07:25
91

ls -ld $(find .)

if you want to sort your output by modification time:

ls -ltd $(find .)
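
If `$(find .)` overflows the argument list (see the "Argument list too long" comment below), you can instead let find batch the arguments itself; the `-exec … {} +` form is standard POSIX and groups as many paths per `ls` invocation as the system allows:

```shell
# same ls -ld output, but immune to "Argument list too long"
find . -exec ls -ld {} +
```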

others
  • -bash: /bin/ls: Argument list too long – jperelli Mar 01 '12 at 17:37
  • +1 worked for me with 12106 files, and I could use the `--sort=extension` parameter of `ls` – n611x007 Sep 28 '12 at 19:22
  • Thanks. I wouldn't have thought of that (nice and short) syntax myself; I would have used `find . -name "*" -exec ls -ld '{}' \;` (which works whatever the number of files is), but your command is way shorter to write ;) – SRG Feb 27 '15 at 13:34
  • `ls -ld $(find .)` breaks for me if I'm listing an NTFS disk where files have spaces: `ls: cannot access ./System: No such file or directory`. However, the quoted find by @SRG works. – kuz8 Dec 01 '16 at 02:49
  • A shorter alternative (depending on your needs) would be `find . -ls`. – IanS Jun 08 '18 at 10:14
  • SRG's command modified to `find . -name "*" -exec ls -ld -w1 -Q -x '{}' \;` shows file names only and quotes everything for NTFS space characters. – Max Aug 28 '20 at 22:13
68

Best command is: tree -fi

-f print the full path prefix for each file
-i don't print indentations

e.g.

$ tree -fi
.
./README.md
./node_modules
./package.json
./src
./src/datasources
./src/datasources/bookmarks.js
./src/example.json
./src/index.js
./src/resolvers.js
./src/schema.js

tree prints symbolic links as `link -> target`, so to keep only the files and not the links, filter the `>` out of your output:

tree -fi |grep -v \>

If you want to know the nature of each file, (to read only ASCII files for example) try a while loop:

tree -fi |
grep -v \> |
while read -r first ; do 
    file "${first}"
done |
grep ASCII
tripleee
kerkael
64

Oh, what a long list of answers. It helped a lot, and finally I put together the ones I was looking for:

To List All the Files in a directory and its sub-directories:

find "$PWD" -type f

To List All the Directories in a directory and its sub-directories:

find "$PWD" -type d

To List All the Directories and Files in a directory and its sub-directories:

find "$PWD"
Sushant Verma
  • And to filter by extension: `find "$PWD" -type f | grep '\.json$'` – Killroy Jan 15 '19 at 14:55
  • No need for post-processing with grep; use `-name` in find, like: `find "$PWD" -type f -name '*.json'`. And if you want to delete the files listed: `find "$PWD" -type f -name '*.json' -exec rm {} \;`. Similarly, if you want to copy instead, replace `rm` with `cp` and add a destination: `-exec cp {} destination \;` – Sushant Verma Jun 18 '19 at 08:41
  • I had to use `"$PWD/"` in my case: `find "$PWD/" -type f` – walknotes Aug 12 '20 at 01:05
  • And to list a particular file with its full path: `find "$PWD/README.md"` – Paul Rougieux Nov 25 '20 at 10:39
62

Try the following simpler way:

find "$PWD"
kenorb
Ivan Alegre
23
du -a

Handy for some limited appliance shells where find/locate aren't available.

Ry-
Rob D
21

I don't know about the full path, but you can use -R for recursion. Alternatively, if you're not bent on ls, you can just do find *.

Justin Johnson
18

Using no external commands other than ls:

ls -R1 /path | 
  while read l; do case $l in *:) d=${l%:};; "") d=;; *) echo "$d/$l";; esac; done
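
A comment below notes that the subdirectories themselves also end up in the list. Where `ls` supports `-p` (it appends `/` to directory entries), they can be filtered out in the same loop; a sketch under that assumption:

```shell
ls -Rp1 /path |
  while read -r l; do
    case $l in
      *:) d=${l%:};;   # directory header: remember it as the prefix
      "") d=;;         # blank separator line: reset
      */) ;;           # a subdirectory entry: skip it, it gets its own header
      *)  echo "$d/$l";;
    esac
  done
```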

Idelic
  • Unknown option '-1'. Aborting. – ilw Jul 07 '17 at 13:18
  • @ilw That's weird; I'd think `ls -1` is fairly [standard](https://pubs.opengroup.org/onlinepubs/9699919799/utilities/ls.html); but try just leaving it out if it's unsupported. The purpose of that option is to force `ls` to print one line per file but that's usually its behavior out of the box anyway. (But then of course, [don't use `ls` in scripts.](https://mywiki.wooledge.org/ParsingLs)) (Looking at the POSIX doco, this option was traditionally BSD only, but was introduced in POSIX in 2017.) – tripleee Apr 12 '21 at 04:50
  • All the subfolders will be in the list, not just files. – linguisticturn Jun 03 '22 at 21:14
14

find / will do the trick

apaderno
Dmitry
13

Run a bash command with the following format:

find /path -type f -exec ls -l \{\} \;

Likewise, to trim away -l details and return only the absolute paths:

find /path -type f -exec ls \{\} \;
Denio Mariz
  • `find -ls` avoids running an external process for each file and is a lot easier to type. – tripleee Apr 12 '21 at 04:47
  • You don't need the `-exec ls \{\} \;` part, since the default behavior of `find` is to print the full path. That is, `find /path -type f` does the job if you don't need the file attributes from `ls -l`. – Denio Mariz Jul 14 '21 at 21:23
10

The easiest way for all you future people is simply:

du

This, however, also shows the size of what's contained in each folder. You can use awk to output only the folder name:

du | awk '{print $2}'
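
Since `du`'s output is tab-separated, `cut` does the same job and also survives spaces in folder names, which `awk '{print $2}'` would truncate at the first space:

```shell
# keep everything after the first tab: the full folder path
du | cut -f2-
```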

Edit: Sorry, my bad, I thought only the folders were needed. I'll leave this here in case anyone in the future needs it anyway...

oers
6112115
  • Interesting, because it shows me stuff I didn't know I wanted to know -- kind of like Google suggest. It turns out, I like knowing how much space each file takes. – Jake Toronto Feb 02 '15 at 18:04
9

This gives you the freedom to use all possible `ls` options:

find -type f | xargs ls -1

Grzegorz Luczywo
9

Don't make it complicated. I just used this and got a beautiful output:

ls -lR /path/I/need
halfer
KellyC
8

I think for a flat list the best way is:

find -D tree /fullpath/to-dir/ 

(or in order to save it in a txt file)

find -D tree /fullpath/to-dir/ > file.txt
Dimitrios
6

Here is a partial answer that shows the directory names.

ls -mR * | sed -n 's/://p'

Explanation:

ls -mR * lists the full directory names ending in a ':', then lists the files in that directory separately

sed -n 's/://p' finds lines that end in a colon, strips off the colon, and prints the line

By iterating over the list of directories, we should be able to find the files as well. Still working on it. It is a challenge to get the wildcards through xargs.
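
One way to finish that iteration with a plain shell loop instead of xargs (a sketch; it assumes no file names contain a colon, and note that an unmatched glob in an empty directory prints literally unless nullglob is set):

```shell
# extract the directory names, then list each directory's entries
# with the directory prefix attached
ls -mR * | sed -n 's/://p' | while IFS= read -r d; do
  for f in "$d"/*; do
    printf '%s\n' "$f"
  done
done
```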

Kevin
6

Adding a wildcard to the end of the directory argument to ls forces full paths. Right now you have this:

$ ls /home/dreftymac/
foo.txt
bar.txt
stackoverflow
stackoverflow/alpha.txt
stackoverflow/bravo.txt
stackoverflow/charlie.txt

You could do this instead:

$ ls /home/dreftymac/*
/home/dreftymac/.
/home/dreftymac/foo.txt
/home/dreftymac/bar.txt
/home/dreftymac/stackoverflow:
alpha.txt
bravo.txt
charlie.txt

Unfortunately this does not print the full path for directories recursed into, so it may not be the full solution you're looking for.

koeselitz
  • Also unfortunately you can't sudo ls with a wildcard (because the wildcard is expanded as the normal user). – andrew lorien Mar 09 '17 at 00:14
  • Also unfortunately, `ls` has a lot of pesky corner cases; see [parsing `ls`](http://mywiki.wooledge.org/ParsingLs) – tripleee Jul 15 '21 at 08:12
4

If the directory is passed as a relative path, you will need to convert it to an absolute path before calling find. In the following example, the directory is passed as the first parameter to the script:

#!/bin/bash

# get absolute path (quoted, so paths with spaces work)
directory=$(cd "$1" && pwd)
# print out list of files and directories
find "$directory"
tripleee
John Keyes
4
tar cf - "$PWD" | tar tvf -

This is slow, but it works recursively and prints both directories and files. You can pipe it through awk/grep if you just want the file names without all the other info/directories:

tar cf - "$PWD" | tar tvf - | awk '{print $6}' | grep -v "/$"
Diego C Nascimento
RuleB
4

A lot of answers I see. This is mine, and I think it is quite useful if you are working on a Mac.

I'm sure you know there are some "bundle" files (.app, .rtfd, .workflow, and so on). Looking at the Finder's window they seem to be single files, but they are not, and ls or find see them as directories... So, unless you need to list their contents as well, this works for me:

find . -not -name ".*" -not -name "." | egrep -v "\.rtfd/|\.app/|\.lpdf/|\.workflow/"

Of course this is for the working dir, and you could add other bundles' extensions (but always with a / after them), or any other extension (if not a bundle's) without the /.

Rather interesting is the ".lpdf/" case (multilingual PDF): it has a normal ".pdf" extension (!!) or none in Finder. This way you get (or it just counts) one file for this PDF and not a bunch of stuff…

Steve
4

ls -lR is what you were looking for, or at least I was. Cheers.

sivi
4

The realpath command prints the resolved path:

realpath *

To include dot files, pipe the output of ls -a to realpath:

ls -a | xargs realpath

To list subdirectories recursively:

ls -aR | xargs realpath

In case you have spaces in file names, man xargs recommends using the -o option to prevent file names from being processed incorrectly. This works best with the output of find -print0, and it starts to look a lot more complex than the other answers:

find -print0 |xargs -0 realpath

See also Unix and Linux stackexchange question on how to list all files in a directory with absolute path.

Paul Rougieux
3

Recursive list of all files from current location:

ls -l $(find . -type f)

Pavlo Neiman
3

If you have to search a big volume, say 100 GB or more, I suggest using the tree command that @kerkael posted rather than find or ls, with the one difference that you write the output to a file:

tree -fi > result.txt

Afterwards, grep the file for a pattern, e.g. grep -i "\.docx" result.txt, so you don't lose time re-scanning the disk; searching a big volume this way is faster.

I did this on a 270 GB volume and got a 100 MB txt file; the tree command took 14 minutes.

Mirko Cianfarani
2

@ghostdog74: A little tweak to your solution.
The following code can be used to search for a file with its full absolute path:

sudo ls -R / | awk '
/:$/&&f{s=$0;f=0}
/:$/&&!f{sub(/:$/,"");s=$0;f=1;next}
NF&&f{ print s"/"$0 }' | grep [file_to_search]
tripleee
Chaitanya
2

I knew the file name but wanted the directory as well.

find $PWD | fgrep filename

worked perfectly in Mac OS 10.12.1

Brian Burns
hpj