1

I'm working on a bash script for my server backup. Starting from my web root (/home), I want to list my web directories while excluding a few general ones. I found the --ignore option of ls for this. Here's my code that returns what I want.

DIR_LIST=`ls -al $WWW_ROOT --ignore={.,..,ubuntu,test} | grep "^d" | awk '{ print $9 }'`
echo $DIR_LIST;

But when I tried it with an array, it didn't work.

EXCLUDED=(. .. test ubuntu)
STR=$(IFS=,; echo "${EXCLUDED[*]}")
DIR_LIST=`ls -al $WWW_ROOT --ignore={$STR} | grep "^d" | awk '{ print $9 }'`
echo $DIR_LIST;

echo $STR works fine, but echo $DIR_LIST does not. I think the brace is not being expanded properly.

How can I make this work as I expected?

Meow Kim
  • 445
  • 4
  • 14

3 Answers

2

Your idea is mostly right, but it has a few issues. First, ls --ignore takes a shell glob pattern, but {.,..,ubuntu,test} is not part of that pattern syntax: it is the shell's brace expansion, which turns the word into one --ignore= option per name before ls ever runs. That is why it works when you spell the list out explicitly.

But when you put the list in a variable, it no longer works, because the shell performs brace expansion before variable expansion. By the time $STR is expanded, the point at which the braces could have been expanded has already passed, so ls receives the single literal pattern {.,..,test,ubuntu} and your --ignore list is never formed the way you expect.
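If you do want to keep building the list from an array, one workaround is to generate one --ignore option per entry instead of relying on braces at all. A rough sketch, assuming GNU ls and reusing your original pipeline only for illustration (IGNORE_OPTS is just a hypothetical name):

EXCLUDED=(. .. test ubuntu)
IGNORE_OPTS=()
for name in "${EXCLUDED[@]}"; do
    IGNORE_OPTS+=(--ignore="$name")   # one --ignore=NAME per excluded entry
done
DIR_LIST=$(ls -al "$WWW_ROOT" "${IGNORE_OPTS[@]}" | grep "^d" | awk '{ print $9 }')
echo "$DIR_LIST"

Each array element becomes its own --ignore=NAME argument, so no brace expansion is needed.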

Also, populating a list by parsing the output of ls is not a recommended approach, because it is fragile (see Why you shouldn't parse the output of ls(1)) when names contain spaces, newlines, or other shell metacharacters.

See How to exclude a directory in find . command for how to exclude directories when searching the file system with find.
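For instance, a minimal find-based sketch (assuming GNU find for -printf; the excluded names come from the question) that prints only the base names of the first-level directories:

find "$WWW_ROOT" -mindepth 1 -maxdepth 1 -type d \
    ! -name test ! -name ubuntu -printf '%f\n'

Here -mindepth 1 -maxdepth 1 keeps the search to the first level, and . and .. never show up because find only reports real entries below the starting point.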

Inian
  • 80,270
  • 14
  • 142
  • 161
2

Here is one way of doing it without using ls (and, to make matters worse, you're using the -al flag).

#!/usr/bin/env bash

shopt -s nullglob extglob

files=(/path/to/www/directory/!(ubuntu|test)/)

declare -p files

That will show you the files in the array assignment.

If you want to loop through the files and strip the pathname from each file name without using any external commands from the shell, note that each glob match ends with a trailing /, so remove that first:

for f in "${files[@]}"; do f=${f%/}; echo "${f##*/}"; done

Which gives the same result as using basename (which strips the trailing / by itself):

for f in "${files[@]}"; do var=$(basename "$f"); echo "$var"; done 

Or just do it directly on the array:

files=("${files[@]%/}")
printf '%s\n' "${files[@]##*/}"

The "${files[@]##*/}" is a form of P.E. (parameter expansion).

There is an online bash manual where you can look up P.E.; see Parameter Expansion.

Or the man page; see PAGER='less +/^[[:space:]]*parameter\ expansion' man bash

Look up nullglob and extglob; see shell globbing.

The array named files now has the data/files that you're interested in.

By default, dotfiles are not matched by globbing, so you don't have to worry about them, unless dotglob is enabled, which is off by default.
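Putting it together for the layout in the question (WWW_ROOT is the variable from the question, and dirs is just a hypothetical name), a rough sketch that ends up with only the base directory names:

shopt -s nullglob extglob
dirs=("$WWW_ROOT"/!(ubuntu|test)/)   # directories only, excluding ubuntu and test
dirs=("${dirs[@]%/}")                # drop the trailing slash from each match
printf '%s\n' "${dirs[@]##*/}"       # print only the base directory names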

Jetchisel
  • 7,493
  • 2
  • 19
  • 18
  • 1
    Thank you for your suggestion. Is there a way to remove /path/to/www/directory/ from result? I need only base directory name for my job. – Meow Kim Feb 17 '20 at 06:06
  • You mean remove the pathname from the filename? You can use a parameter expansion `"${files##*/}"`, `for f in "${files[@]}"; do echo "${f##*/}"; done` – Jetchisel Feb 17 '20 at 06:21
-1

If "ubuntu" and "test" are files, it should work. If they are directories, you need to add a dot ".", like ".test" or ".ubuntu", in EXCLUDED.

The_flash
  • 184
  • 1
  • 1
  • 11