  • Operating system: Linux

  • Filesystem type: ext3

  • Preferred solution: Bash (script/one-liner), Ruby, or Python

I have several directories with several subdirectories and files in them. I need to make a list of all these directories that is constructed in a way such that every first-level directory is listed next to the date and time of the latest created/modified file within it.

To clarify, if I touch a file or modify its contents a few subdirectory levels down, that timestamp should be displayed next to the first-level directory name. Say I have a directory structured like this:

./alfa/beta/gamma/example.txt

and I modify the contents of the file example.txt, I need that time displayed next to the first-level directory alfa in human readable form, not epoch. I've tried some things using find, xargs, sort and the like, but I can't get around the problem that the filesystem timestamp of 'alfa' doesn't change when I create/modify files a few levels down.
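To illustrate, here is a minimal reproduction of the problem (a hypothetical scratch directory, GNU stat assumed): the first-level directory keeps its old timestamp even after a file three levels down is written.

```shell
# Build the example tree from the question in a scratch directory
demo=$(mktemp -d)
mkdir -p "$demo/alfa/beta/gamma"
touch -d '2020-01-01 00:00' "$demo/alfa"     # give alfa a known old mtime
echo 'hello' > "$demo/alfa/beta/gamma/example.txt"

# alfa's own mtime is still 2020-01-01: writing example.txt did not touch it
stat --format '%y %n' "$demo/alfa" "$demo/alfa/beta/gamma/example.txt"
```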

fredrik
  • If you can take the pain of building it, https://github.com/shadkam/recentmost can be used. – user3392225 Mar 07 '14 at 10:37
  • Instead of solutions such as a -R switch, I just see bulk here. – neverMind9 Nov 19 '18 at 13:17
  • @user3392225 A fork of github / shadkam / recentmost can be found at https://github.com/ConradHughes/recentmost with the `-0` option to use with `find`'s `-print0` – Setaa Jun 27 '20 at 02:22
  • Related questions: [How do I change folder timestamps recursively to the newest file?](https://unix.stackexchange.com/q/1524/307359) [How to make directory modification date change when files from that dir change?](https://unix.stackexchange.com/q/84274/307359) – M Imam Pratama May 17 '22 at 08:56

22 Answers


Try this one:

#!/bin/bash
find "$1" -type f -exec stat --format '%Y :%y %n' "{}" \; | sort -nr | cut -d: -f2- | head

Execute it with the path to the directory where it should start scanning recursively (it supports filenames with spaces).

If there are lots of files it may take a while before it returns anything. Performance can be improved if we use xargs instead:

#!/bin/bash
find "$1" -type f -print0 | xargs -0 stat --format '%Y :%y %n' | sort -nr | cut -d: -f2- | head

which is a bit faster.

Heppo
  • Your "fast method" should also be able to use print0 to support spaces and even linefeeds in filenames. Here's what I use: `find $1 -type f -print0 | xargs -0 stat --format '%Y :%y %n' | sort -nr | cut -d: -f2- | head` This still manages to be fast for me. – Dan Jun 20 '12 at 23:12
  • Some directories I was looking in didn't allow me to `stat` them, so I made the following changes (to the 'fast' one) so I didn't have to see the errors in my final output. `find ${1} -type f | xargs stat --format '%Y :%y %n' 2>/dev/null | sort -nr | cut -d: -f2-` – TJ L Dec 14 '12 at 14:13
  • This was immensely helpful in trying to figure out why my NAS' hard drive kept waking up. Could anyone perhaps recommend a similar alternative which limits the results to within x-days? (i.e. the same type of listing, but only for files modified in the last 24hrs)? – J23 Jan 01 '13 at 22:34
  • On Mac OS X it's not GNU's stat so command fails. You have to `brew install coreutils` and use `gstat` instead of `stat` – CharlesB Mar 28 '13 at 10:56
  • You don't need to run `stat` since `find PATH -type f -printf "%T@ %p\n"| sort -nr` does the job. It's also a bit faster that way. – n.r. Jun 16 '13 at 03:49
  • To find all files that file status was last changed 5 minutes ago: `find -cmin -5` – iman Oct 23 '13 at 10:42
  • Also if you want to show more than how much it shows by default, throw -n # on the 'head' command. – Xedecimal Apr 27 '14 at 21:06
  • On Mac OS X, without installing gstat or anything else, you can do: `find PATH -type f -exec stat -f "%m %N" "{}" \; | sort -nr | head` – cobbzilla Jan 07 '16 at 23:39
  • On debian jessie `stat: missing operand` – Dimitri Kopriwa Jul 14 '16 at 07:59
  • in some stat versions "--format" option doesn't work, so please use this one: find $1 -type f -exec stat -c '%Y :%y %n' "{}" \; | sort -nr | cut -d: -f2- | head – Eljah Oct 21 '20 at 20:01
  • I like this solution because instead of `find` I can use `ripgrep` to respect `.gitignore`: `rg --files "$1" -0 | xargs -0r stat --format '%Y :%n' | sort -nr | cut -d: -f2- | head` – Maxim Suslov May 13 '21 at 02:44
  • If you need a simpler output for the modification date you can truncate it with `--format '%Y :%.19y %n'` – Manuel Rozier Aug 19 '21 at 14:20
  • Is there a way to make nautilus use this method when sorting by date? i.e.: Nautlius ignores if a subfolder was modified and makes sorting by date somewhat useless for large projects... – Louis Gagnon Sep 10 '21 at 08:09
  • You can also add `pv` by changing `| sort -nr |` to `| pv -l | sort -nr |`. This will give a nice progress and count of how many files have been found. Gives you a good indication as well if you know roughly how many files you are scanning. – Sarke Oct 12 '21 at 12:40

To find all files whose file status was last changed N minutes ago:

find -cmin -N

For example:

find -cmin -5

Use -ctime instead of -cmin for days:

find -ctime -3

On FreeBSD and MacOS: You can also use -ctime n[smhdw] for seconds, minutes, hours, days, and weeks. Days is the default if no unit is provided.

Examples:

# FreeBSD and MacOS only:
find . -ctime -30s
find . -ctime -15
find . -ctime -52w
iman

GNU find (see man find) has a -printf action that can display each file's mtime as an Epoch value together with its relative path name.

redhat> find . -type f -printf '%T@ %P\n' | sort -n | awk '{print $2}'
user2570243
  • Thanks! This is the only answer that is fast enough to search through my very wide directory structure in a reasonable time. I pass the output through `tail` to prevent thousands of lines from being printed in the output. – sffc Oct 18 '13 at 03:52
  • Another comment: the `awk '{print $2}'` part seems to cause issues when there are filenames with spaces. Here is a solution using `sed` instead, and it also prints the time in addition to the path: `find . -type f -printf '%T@ %Tc %P\n' | sort -n | tail | sed -r 's/^.{22}//'` – sffc Oct 18 '13 at 04:04
  • I think it should be sort -rn – Bojan Dević Mar 24 '15 at 11:46
  • The -printf variant is far quicker than calling a 'stat' process each time - it cut hours off my backup job(s). Thanks for making me aware of this. I avoided the awk/sed thing as I'm only concerned about the last update within the tree - so X=$(find /path -type f -printf '%T %p\n' | grep -v something-I-don-tcare-about | sort -nr | head -n 1) and a echo ${X#*" "} worked well for me (give me stuff up to the first space) – David Goodwin Feb 04 '16 at 16:19
  • None of these will work if a filename spans multiple lines. Use `touch "lalab"` to create such a file. I think the Unix utilities design has a big flaw regarding filenames. – 林果皞 Apr 20 '16 at 06:48
  • @林果皞 true, but a) if your nodes contain newlines, you're seeking for trouble; b) you can use find's `-print0` flag to output null-separated items, and process them accordingly; or in this case, `-printf ...\0'` as opposed to `-printf ...\n'` – laur Jul 09 '22 at 14:55

I shortened Daniel Böhmer's awesome answer to this one-liner:

stat --printf="%y %n\n" $(ls -tr $(find * -type f))

If there are spaces in filenames, you can use this modification:

OFS="$IFS";IFS=$'\n';stat --printf="%y %n\n" $(ls -tr $(find . -type f));IFS="$OFS";
slashdottir
  • How about this: IFS=$'\n'; stat --printf="%y %n\n" $(ls -tr $(find . -type f)) – slashdottir Apr 28 '14 at 19:02
  • This will not work if you have a very large number of files. the answers that use xargs solve that limit. – carl verbiest Jan 27 '15 at 14:32
  • @carlverbiest indeed a large number of files will break slashdottir's solution. Even xargs-based solutions will be slow then. [user2570243's solution](https://stackoverflow.com/a/17580855/1429390) is best for big filesystems. – Stéphane Gourichon Dec 14 '17 at 18:38
  • `IFS=$'\n'` isn't safe in any event when handling filenames: Newlines are valid characters in filenames on UNIX. Only the NUL character is guaranteed not to be present in a path. – Charles Duffy Sep 13 '18 at 17:10

Try this:

#!/bin/bash
stat --format %y $(ls -t $(find alfa/ -type f) | head -n 1)

It uses find to gather all files from the directory, ls to list them sorted by modification date, head for selecting the first file and finally stat to show the time in a nice format.

At this time it is not safe for files with whitespace or other special characters in their names. Write a comment if it doesn't meet your needs yet.

Daniel Böhmer
  • halo: I like your answer, it works well and prints out the correct file. I doesn't help me however since there are too many sublevels in my case. So I get "Argument list too long" for ls... and xargs wouldn't help in this case either. I'll try something else. – fredrik Apr 12 '11 at 08:34
  • In that case it's a bit more complex and will need some real program. I will hack some Perl. – Daniel Böhmer Apr 12 '11 at 08:36
  • I solved this using PHP instead. A recursive function that descends through the filesystem tree and stores the time of the most recently modified file. – fredrik Apr 19 '11 at 11:33
  • On MacOS I needed to use `stat $(ls -t $(find alfa/ -type f) | head -n 10)`. `--format` would be `-f`, but there is no `%y` and I didn't bother finding a replacement. – Michael Bolli Jun 23 '23 at 20:20
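The recursive approach fredrik describes (walk the tree, keep the newest mtime, then print it next to each first-level directory) can be sketched in bash with GNU stat/date; `newest_mtime` is a hypothetical helper name, not part of any answer above:

```shell
# newest_mtime DIR: epoch seconds of the newest file anywhere under DIR
newest_mtime() {
    local path t newest=0
    while IFS= read -r -d '' path; do
        t=$(stat --format '%Y' "$path")
        if (( t > newest )); then newest=$t; fi
    done < <(find "$1" -type f -print0)
    echo "$newest"
}

# Demo on a scratch tree: each first-level directory next to its newest file's time
demo=$(mktemp -d)
mkdir -p "$demo/alfa/beta/gamma"
echo hello > "$demo/alfa/beta/gamma/example.txt"
for d in "$demo"/*/; do
    printf '%s  %s\n' "$(date -d @"$(newest_mtime "$d")" '+%Y-%m-%d %H:%M:%S')" "${d%/}"
done
```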

Ignoring hidden files — with nice & fast time stamp

Here is how to find and list the latest modified files in a directory with subdirectories. Hidden files are ignored on purpose, while spaces in filenames are handled well (not that you should use those!). The time format can be customised.

$ find . -type f -not -path '*/\.*' -printf '%TY.%Tm.%Td %THh%TM %Ta %p\n' |sort -nr |head -n 10

2017.01.25 18h23 Wed ./indenting/Shifting blocks visually.mht
2016.12.11 12h33 Sun ./tabs/Converting tabs to spaces.mht
2016.12.02 01h46 Fri ./advocacy/2016.Vim or Emacs - Which text editor do you prefer?.mht
2016.11.09 17h05 Wed ./Word count - Vim Tips Wiki.mht

More find galore can be found by following the link.

Serge Stroobandt

This command works on Mac OS X:

find "$1" -type f -print0 | xargs -0 gstat --format '%Y :%y %n' | sort -nr | cut -d: -f2- | head

On Linux, as the original poster asked, use stat instead of gstat.

This answer is, of course, user37078's outstanding solution, promoted from comment to full answer. I mixed in CharlesB's insight to use gstat on Mac OS X. I got coreutils from MacPorts rather than Homebrew, by the way.

And here's how I packaged this into a simple command ~/bin/ls-recent.sh for reuse:

#!/bin/bash
# ls-recent: list files in a directory tree, most recently modified first
#
# Usage: ls-recent path [-10 | more]
#
# Where "path" is a path to target directory, "-10" is any argument to pass
# to "head" to limit the number of entries, and "more" is a special argument
# in place of "-10" which calls the pager "more" instead of "head".
if [ "more" = "$2" ]; then
   H=more; N=''
else
   H=head; N=$2
fi

find "$1" -type f -print0 |xargs -0 gstat --format '%Y :%y %n' \
    |sort -nr |cut -d: -f2- |$H $N
Jim DeLaHunt
  • On OS X yosemite; I get error: find: ftsopen: No such file or directory – Reece Dec 08 '16 at 00:09
  • Interesting. What command did you type (with parameters)? And what were the names of the files in that directory? And if you created your own version of `~/bin/ls-recent.sh`, have you carefully checked the script for differences? – Jim DeLaHunt Dec 08 '16 at 22:50
  • for those who don't want to install anything on Mac OS X: `find . -exec stat -f '%m%t%Sm %N' {} + | sort -n | cut -f2-` – Jake Apr 30 '17 at 21:43
  • @Jake: I think your comment should be promoted to a full answer. This is what Mac users are looking for. Thank you! – Andreas Rayo Kniep Jul 23 '20 at 02:48
  • What I ended up using based on @Jake: `find . -type f -exec stat -f '%m%t%Sm %N' {} + | sort -nr | cut -f2- | grep -v ".DS_Store" | head -10`. The changes are a reverse sort to have most recent at the top and only keep top-10 most recent through `head`, as well as filtering out directories and DS_Store files with `-type f` and `grep`. – Bar Nov 14 '22 at 19:08

This is what I'm using (very efficient):

function find_last () { find "${1:-.}" -type f -printf '%TY-%Tm-%Td %TH:%TM %P\n' 2>/dev/null | sort | tail -n "${2:-10}"; }

PROS:

  • it spawns only 3 processes no matter how many files are scanned
  • works with filenames containing spaces
  • works for large number of files

USAGE:

find_last [dir [number]]

where:

  • dir - a directory to be searched [current dir]
  • number - number of newest files to display [10]

Output for find_last /etc 4 looks like this:

2019-07-09 12:12 cups/printers.conf
2019-07-09 14:20 salt/minion.d/_schedule.conf
2019-07-09 14:31 network/interfaces
2019-07-09 14:41 environment
Seweryn Niemiec

Both the Perl and Python solutions in this post helped me solve this problem on Mac OS X:

How to list files sorted by modification date recursively (no stat command available!)

Quoting from the post:

Perl:

find . -type f -print |
perl -l -ne '
    $_{$_} = -M;  # store file age (mtime - now)
    END {
        $,="\n";
        print sort {$_{$b} <=> $_{$a}} keys %_;  # print by decreasing age
    }'

Python (adjusted here for Python 3):

find . -type f -print |
python3 -c 'import os, sys
times = {}
for f in sys.stdin.read().splitlines():
    times[f] = os.stat(f).st_mtime
for f in sorted(times, key=times.get):
    print(f)'
William Niu

Here is one version that works with filenames that may contain spaces, newlines, and glob characters as well:

find . -type f -printf "%T@ %p\0" | sort -zk1nr
  • find ... -printf prints the file modification time (Epoch value) followed by a space and \0 terminated filenames.
  • sort -zk1nr reads NUL terminated data and sorts it reverse numerically

As the question is tagged with Linux, I am assuming GNU Core Utilities are available.

You can pipe the above with:

xargs -0 printf "%s\n"

to print the modification time and filenames sorted by modification time (most recent first) terminated by newlines.
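Putting the two stages together on a small scratch directory (a sketch; `old.txt` and `new.txt` are made-up names), the newest entry comes out first:

```shell
# Scratch directory with one old and one new file
cd "$(mktemp -d)"
touch -d '2001-01-01 00:00' old.txt
echo hi > new.txt

# NUL-safe sort by mtime, newest first, then newline-terminated output
find . -type f -printf "%T@ %p\0" | sort -zk1nr | xargs -0 printf "%s\n"
```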

anubhava

I'm showing this for the latest access time, and you can easily modify this to do latest modification time.

There are two ways to do this:


  1. If you want to avoid global sorting which can be expensive if you have tens of millions of files, then you can do (position yourself in the root of the directory where you want your search to start):

     Linux> touch -d @0 /tmp/a
     Linux> find . -type f -exec tcsh -f -c 'test `stat --printf="%X" {}` -gt `stat --printf="%X" /tmp/a`' \; -exec tcsh -f -c 'touch -a -r {} /tmp/a' \; -print
    

    The above method prints filenames with progressively newer access time and the last file it prints is the file with the latest access time. You can obviously get the latest access time using a "tail -1".

  2. You can have find recursively print the name and access time of all files in your subdirectory and then sort based on access time and the tail the biggest entry:

     Linux> \find . -type f -exec stat --printf="%X  %n\n" {} \; | \sort -n | tail -1
    

And there you have it...

Sean

I have this alias in my .profile that I use quite often:

$ alias | grep xlogs
xlogs='sudo find . \( -name "*.log" -o -name "*.trc" \) -mtime -1 | sudo xargs ls -ltr --color | less -R'

So it does what you are looking for (with the exception that it doesn't aggregate change dates/times across multiple directory levels): it looks for the latest files (*.log and *.trc files in this case); it also only finds files modified in the last day, and then sorts by time and pipes the output through less:

sudo find . \( -name "*.log" -o -name "*.trc" \) -mtime -1 | sudo xargs ls -ltr --color | less -R

PS.: Notice I don't have root on some of the servers, but always have sudo, so you may not need that part.

Tagar
  • How is this "exactly what you are looking for"? The OP wrote a good explanation of what he wanted, and this totally ignores it. – hmijail Nov 04 '17 at 11:29
  • thanks for pointing to that. you're correct - this method doesn't go multiple levels to get change date/time, it only shows date/time of directories' files within it. edited my answer. – Tagar Nov 04 '17 at 23:17

This should actually do what the OP specifies:

One-liner in Bash:

$ for first_level in `find . -maxdepth 1 -type d`; do find $first_level -printf "%TY-%Tm-%Td %TH:%TM:%TS $first_level\n" | sort -n | tail -n1 ; done

which gives output such as:

2020-09-12 10:50:43.9881728000 .
2020-08-23 14:47:55.3828912000 ./.cache
2018-10-18 10:48:57.5483235000 ./.config
2019-09-20 16:46:38.0803415000 ./.emacs.d
2020-08-23 14:48:19.6171696000 ./.local
2020-08-23 14:24:17.9773605000 ./.nano

This lists each first-level directory with the human-readable timestamp of the latest file within those folders, even if it is in a subfolder, as requested in

"I need to make a list of all these directories that is constructed in a way such that every first-level directory is listed next to the date and time of the latest created/modified file within it."

tomsv

@anubhava's answer is great, but unfortunately won't work on BSD tools – i.e. it won't work with the find that comes installed by default on macOS, because BSD find doesn't have the -printf operator.

So here's a variation that works with macOS + BSD (tested on my Catalina Mac), which combines BSD find with xargs and stat:

$ find . -type f -print0 \
      | xargs -0 -n1 -I{} stat -f '%Fm %N' "{}" \
      | sort -rn 

While I'm here, here's the BSD command sequence I like to use, which puts the timestamp in ISO 8601 format:

$ find . -type f -print0 \
    | xargs -0 -n1 -I{} \
       stat  -f '%Sm %N' -t '%Y-%m-%d %H:%M:%S' "{}" \
    | sort -rn

(note that both my answers, unlike @anubhava's, pass the filenames from find to xargs as a single argument rather than a \0 terminated list, which changes what gets piped out at the very end)

And here's the GNU version (i.e. @anubhava's answer, but in iso-8601 format):

$ gfind . -type f -printf "%T+ %p\0" | sort -zk1nr

Related q: find lacks the option -printf, now what?

dancow
  • I needed to get the most recently modified file name, so I then added `| head -1 | cut -d' ' -f2` to only get the filename of the latest entry, but your first command sequence put me on the right path. – GameSalutes Nov 09 '21 at 22:01

Quick Bash function:

# findLatestModifiedFiles(directory, [max=10, [format="%Td %Tb %TY, %TT"]])
function findLatestModifiedFiles() {
    local d="${1:-.}"
    local m="${2:-10}"
    local f="${3:-%Td %Tb %TY, %TT}"

    find "$d" -type f -printf "%T@ :$f %p\n" | sort -nr | cut -d: -f2- | head -n"$m"
}

Find the latest modified file in a directory:

findLatestModifiedFiles "/home/jason/" 1

You can also specify your own date/time format as the third argument.
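For instance, passing an ISO-style format as the third argument makes the function run the equivalent of the following (a sketch; the scratch directory stands in for `"$d"` and 5 for `"$m"`):

```shell
# Scratch directory standing in for the function's first argument
dir=$(mktemp -d)
touch "$dir/example.txt"

# Equivalent expansion of: findLatestModifiedFiles "$dir" 5 "%TY-%Tm-%Td %TH:%TM"
find "$dir" -type f -printf '%T@ :%TY-%Tm-%Td %TH:%TM %p\n' | sort -nr | cut -d: -f2- | head -n5
```

Note that `cut -d: -f2-` rejoins fields with `:`, so the colons inside the custom time format survive; only the leading epoch sort key is stripped.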

Jason Larke

The following returns you a string of the timestamp and the name of the file with the most recent timestamp:

find $Directory -type f -printf "%TY-%Tm-%Td-%TH-%TM-%TS %p\n" | sed -r 's/([[:digit:]]{2})\.([[:digit:]]{2,})/\1-\2/' | sort --field-separator='-' -nrk1 -nrk2 -nrk3 -nrk4 -nrk5 -nrk6 -nrk7 | head -n 1

Resulting in an output of the form: <yy-mm-dd-hh-mm-ss.nanosec> <filename>

mark_infinite

For those who faced

stat: unrecognized option: format

when executing the line from Heppo's answer (find $1 -type f -exec stat --format '%Y :%y %n' "{}" \; | sort -nr | cut -d: -f2- | head):

Please try the -c option in place of --format; the final call will be:

find $1 -type f -exec stat -c '%Y :%y %n' "{}" \; | sort -nr | cut -d: -f2- | head

That worked for me inside of some Docker containers, where stat was not able to use --format option.

Eljah
  • The `stat` command is not properly standardized so it accepts different options on different platforms. `--format` (aka `-c`) is what Linux uses (or anything with GNU Coreutils); on e.g. MacOS you need `-f` and the supported format flags are different. I'm guessing `-c` but not `--format` might be Alpine (update: confirmed) or Busybox. – tripleee Dec 10 '20 at 18:22
  • On Linux (or generally GNU userspace) systems `find -printf` can do most of what `stat` can do without the need to invoke an external tool. – tripleee Dec 10 '20 at 18:26

Here is a Bash one-liner for recursively finding the latest modified files in multiple directories. Run the command below with your target directories:

 ls -ltr $(find /path/dir1 /path/dir2 -type f)

To restrict it to today, grep for today's date or time, as in the command below:

 (ls -ltr $(find /path/dir1 /path/dir2 -type f)) |grep -i 'Oct 24'
linux.cnf
  • The first command seems to break on directories with spaces in the name. Is there a quick and easy fix to this command, or is it back to one of the ones above already posted? – Alan Jul 20 '22 at 01:02
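One quoting-safe variant of the first command (a sketch; the scratch directories stand in for /path/dir1 and /path/dir2): handing the names to xargs NUL-separated keeps spaces intact.

```shell
# Scratch dirs standing in for /path/dir1 /path/dir2, with a space in a name
d=$(mktemp -d)
mkdir -p "$d/dir one" "$d/dir two"
touch "$d/dir one/a file.log"

# NUL-separated handoff survives spaces in directory and file names
find "$d/dir one" "$d/dir two" -type f -print0 | xargs -0 ls -ltr
```

This still inherits `ls -ltr`'s limits on very long file lists (xargs may split them into several `ls` invocations, each sorted separately).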

This could be done with a recursive function in Bash too.

Let F be a function that displays the time of a file in a format that is lexicographically sortable, yyyy-mm-dd, etc. (OS-dependent?):

F(){ stat --format %y "$1";}                # Linux
F(){ ls -E "$1"|awk '{print$6" "$7}';}      # SunOS: maybe this could be done easier

R, the recursive function that runs through directories:

R(){ local f;for f in "$1"/*;do [ -d "$f" ]&&R "$f"||F "$f";done;}

And finally

for f in *;do [ -d "$f" ]&&echo `R "$f"|sort|tail -1`" $f";done
Nahuel Fouilleul

You may give find's -printf action a try:

%Ak File's last access time in the format specified by k, which is either `@` or a directive for the C `strftime` function. The possible values for k are listed below; some of them might not be available on all systems, due to differences in `strftime` between systems.

Please find the details in @anubhava's answer

graugans

On mac I use this

find . -type f -exec stat -f "%m %N" "{}" \; | sort -nr | perl -n -e '@a = split / /;print `ls -l $a[1]`' | vim -

If you want to filter out some files, you can use grep with a regexp, i.e.:

find . -type f -exec stat -f "%m %N" "{}" \; | sort -nr | grep -v -E \.class$ | perl -n -e '@a = split / /;print `ls -l $a[1]`' | vim -
Gerd

For plain ls output, use this. There is no argument list, so it can't get too long:

find . | while read FILE;do ls -d -l "$FILE";done

And niceified with cut for just the dates, times, and name:

find . | while read FILE;do ls -d -l "$FILE";done | cut --complement -d ' ' -f 1-5

EDIT: Just noticed that the current top answer sorts by modification date. That's just as easy with the second example here, since the modification date is first on each line - slap a sort onto the end:

find . | while read FILE;do ls -d -l "$FILE";done | cut --complement -d ' ' -f 1-5 | sort
Izkata