I have this folder structure, with really heavy, high-quality images in each subfolder:
tree -d ./
Output:
./
├── 1_2dia-pedro
├── 3dia-pedro
│   ├── 101MSDCF
│   └── 102MSDCF
├── 4dia-pedro
└── Wagner
    ├── 410_0601
    ├── 411_0701
    ├── 412_0801
    ├── 413_2101
    ├── 414_2801
    ├── 415_0802
    ├── 416_0902
    ├── 417_1502
    ├── 418_1602
    ├── 419_1702
    ├── 420_2502
    └── 421_0103
18 directories
And I want to compress it, much as I would with ffmpeg or ImageMagick, e.g.:
ffmpeg -i %%F -q:v 10 processed/%%F
mogrify -quality 3 "$F.png"
I'm currently thinking of creating an array of the directories using shopt, as discussed here:
shopt -s globstar dotglob nullglob
printf '%q\n' **/*/
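To actually keep hold of those directories rather than just print them, the glob can be assigned straight into a bash array. A minimal sketch (the array name `dirs` is my choice):

```shell
#!/usr/bin/env bash
# Collect every subdirectory, recursively, into a bash array.
shopt -s globstar dotglob nullglob

dirs=( **/*/ )    # each element keeps its trailing slash

printf 'found %d directories\n' "${#dirs[@]}"
for d in "${dirs[@]}"; do
    printf '%q\n' "$d"
done
```

`nullglob` matters here: without it, a tree with no subdirectories would leave the literal string `**/*/` in the array.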
Then, create a new folder, folder-compressed, with the same structure:
mkdir folder-compressed
<<iterate the array-vector-out-of-shopt>>
Finally, for each subfolder, compress with something along the lines of:
mkdir processed
for f in *.jpg; do
    ffmpeg -i "$f" -q:v 1 processed/"${f%.jpg}.jpg"
done
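The three steps above (enumerate, mirror the structure, compress) can be folded into one loop, since `mkdir -p` can recreate each subfolder on demand. A sketch, assuming ffmpeg is installed and using `folder-compressed` as the destination name (adjust to taste):

```shell
#!/usr/bin/env bash
set -euo pipefail
shopt -s globstar nullglob

src="."
dst="folder-compressed"

for f in "$src"/**/*.jpg; do
    # Skip anything already inside the destination, so reruns are safe.
    [[ $f == "$src/$dst"/* ]] && continue
    out="$dst/${f#"$src"/}"          # same relative path, under $dst
    mkdir -p "$(dirname "$out")"     # recreate the subfolder on demand
    ffmpeg -y -i "$f" -q:v 10 "$out"
done
```

The `${f#"$src"/}` expansion strips the source prefix, which is what preserves the relative structure under the new root.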
Also, I read this question, and its procedure seems close to what I would like:
for f in mydir/**/*
do
# operations here
done
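Filling in that loop body, one common pattern is to filter as you go, acting only on regular `.jpg` files and skipping the directories that `**/*` also matches. A sketch, keeping the question's placeholder name `mydir`:

```shell
#!/usr/bin/env bash
shopt -s globstar nullglob

for f in mydir/**/*; do
    # Only regular files ending in .jpg; directories fall through.
    [[ -f $f && $f == *.jpg ]] || continue
    echo "would process: $f"    # replace with the ffmpeg/mogrify call
done
```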
Major problem: I'm a bash newbie. I know all the tools I need are at my disposal!
EDIT: There is a program that, for the particular purpose of losslessly compressing images, gives us a one-liner with a lot of compression options. The caveat: make a copy of the original folder structure first, because it permanently modifies the files you give it in place.
cp -r ./<folder-structure-files> ./<folder-structure-files-copy>
image_optim -r ./<folder-structure-files-copy>
I think @m0hithreddy's solution is pretty cool, though. I will certainly be using that logic elsewhere soon.