
I gathered a list of zip files on a remote disk using find /path/ -name '*.zip' > ~/ziplist.txt.

~/ziplist.txt looks like this:

./path/to/the/file.zip
./path/to/the/file2.zip
./path/to/the/file3.zip
./path/to/the/very/nice/file.zip

I filtered this list using grep, and now that I have the correct list in a .txt file, I would like to feed it to ls x --full-time to gather the timestamps (x being the list of files).

Is it possible?

I tried something like this: for f in $(cat tmp.txt); do echo $(ls --full-time $f); done

EDIT: another thing I tried is: cat tmp.txt | sed "s/\(.*\)/'\1'/" ; ls --full-time $(!!). The sed part is needed since obviously there are spaces in the file names... Otherwise, if the files don't have spaces, $(!!) works just fine.

A solution that could also work, I guess, is ls -R --full-time | grep "what I want" > listfile.txt, but I think this would take a very long time to run. Having small steps is essential so I can check the list length, for example, and because my access to the remote disk is sometimes cut off. Running find already took me one hour.
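I also wonder whether find itself could print the timestamps in a single pass (GNU find has a -printf option, but I haven't checked whether the Git Bash build supports it):

find /path/ -name '*.zip' -printf '%T+ %p\n' > listfile.txt

But that would mean re-running the slow find.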

Note: I'm on Windows 10 and use Git Bash to run commands, so I can't run elaborate .sh scripts. I would prefer a single terminal command. I know, not the best configuration.

Gowachin

2 Answers


You can embed the content of a file directly into the command.

$: ls --full-time $(<~/ziplist.txt)

In this case, do NOT quote the file insertion, as that will treat the entire content of the file as a single filename.
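A quick sketch of the difference, using the ziplist.txt from the question:

$: ls --full-time "$(<~/ziplist.txt)"   # the whole file becomes ONE filename, so ls fails
$: ls --full-time $(<~/ziplist.txt)     # word-split into separate arguments (but splits on spaces too)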

To read them individually, don't read lines with for loops; use a while read loop instead:

while read -r file; do ls --full-time "$file"; done < ~/ziplist.txt

Slower, but lets you quote the file names to avoid issues such as embedded spaces.

Before you spend too much time on that, please read why you should not parse ls output.
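As a quick illustration (on a POSIX filesystem; NTFS won't even let you create this name), a newline inside a filename breaks any line-oriented parsing of ls output:

$: touch $'evil\nname.zip'
$: ls | wc -l   # counts one more "file" than actually exists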

cf. stat

mapfile -t lst < ~/ziplist.txt && stat -c "%x %n" "${lst[@]}"
2021-06-18 13:44:18.351339200 -0500 dev
2021-10-21 20:50:20.649098700 -0500 sts
2021-10-21 09:11:37.398341900 -0500 file
2021-10-25 08:13:03.788650200 -0500 a b c

Note that dev is a directory here, and a b c has embedded spaces.
%x/%X, %y/%Y, and %z/%Z report access, modification, and status change times, respectively; the uppercase forms print seconds since the epoch, the lowercase forms a human-readable date/time with fractional seconds and timezone info.
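For instance, printing the access time of file (from the listing above) in both forms; 1634825497 is just 2021-10-21 09:11:37 -0500 expressed in epoch seconds:

stat -c '%X %x' file
1634825497 2021-10-21 09:11:37.398341900 -0500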

Paul Hodges
  • I just edited the question; I achieved the same result using `$(!!)` after `cat ziplist.txt`. But now the issue is about spaces in the file names. ^^' – Gowachin Oct 22 '21 at 12:43
  • stat is the very thing I need, thanks! I don't think I have parsing issues because all the files were created on Windows and its filenames are more restrictive. Just need to fix this space issue though... – Gowachin Oct 22 '21 at 12:56
  • Windows allows spaces, and other things can get in that you wouldn't expect, but prefer `stat` or `find`. ;) – Paul Hodges Oct 22 '21 at 13:00
  • Accept the solution if it works for you? Always good policy! – Paul Hodges Oct 22 '21 at 13:01
  • It's not yet the solution ^^' I have spaces in my files. I'll accept it though, but a little help to finish it would be perfect. – Gowachin Oct 22 '21 at 13:04

The answer from @Paul Hodges works fine if there are no spaces in the file names. But spaces were exactly my case, so here is some more information.

First step: I find all the zip files and save their locations in a txt file: find /disk/Path/ -name '*.zip' > ~/ziplist.txt

I then grep in this file to find the ones relevant to my project: grep keyword ziplist.txt > mylist.txt. I use wc -l mylist.txt ; cat mylist.txt to check the list.

Now I want to find some time information about this file list. Since there are spaces in it, the file paths get broken up. First I need to modify the internal field separator (IFS), with the code found here:

OLD_IFS=$IFS   # save the current IFS so it can be restored later with IFS=$OLD_IFS
IFS=$'\n'      # split only on newlines, so spaces inside paths survive

As I am on Windows, the end-of-line characters (CRLF) are not compatible with the new IFS, so I need to convert the file first, with this help:

dos2unix mylist.txt
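If dos2unix is missing from your Git Bash install, a sed one-liner should do the same job (assuming GNU sed, which Git Bash ships):

sed -i 's/\r$//' mylist.txt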

Now I can use stat on mylist.txt:

stat $(cat mylist.txt) --format '%w %z' > ~/timefile.txt
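Note: an alternative I found that avoids the IFS change entirely is to let GNU xargs split on newlines itself (xargs ships with Git Bash, I believe; dos2unix is still needed first to strip the carriage returns):

xargs -d '\n' stat --format '%w %z %n' < mylist.txt > ~/timefile.txt

The %n adds the file name, so each line of output can be matched back to its file.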
Gowachin