24

I have a varsValues.txt file

cat varsValues.txt
aa=13.7
something=20.6
countries=205
world=1
languages=2014
people=7.2
oceans=3.4

And I would like to create 2 arrays, vars and values. They should contain:

echo ${vars[@]}
aa something countries world languages people oceans

echo ${values[@]}
13.7 20.6 205 1 2014 7.2 3.4

I use

Npars=7

readarray -t vars < <(cut -d '=' -f1 varsValues.txt)
readarray -t values < <(cut -d '=' -f2 varsValues.txt)

for (( yy=0; yy<$Npars; yy++ )); do
    eval ${vars[$yy]}=${values[$yy]}
done

echo $people
7.2

But I would like to do it without readarray, which does not work on Mac (OS X), and without IFS (the internal field separator).

Any other solution (awk? perl?) that I can use in my bash script?

Thanks.

Paul R
user3652962
  • What are you trying to do? Post some expected output. Given your input, there's an excellent chance you should just be writing an awk script, not a shell script, but we can't help you with that until we know what it's supposed to do. – Ed Morton May 24 '14 at 14:27

8 Answers

18

You could use a read loop.

while IFS=\= read var value; do
    vars+=($var)
    values+=($value)
done < varsValues.txt
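
A slightly hardened variant of the same loop (a sketch, assuming bash 3.1+ for printf -v) also assigns each name directly, like the eval loop in the question, but without eval:

while IFS='=' read -r var value; do
    vars+=("$var")
    values+=("$value")
    printf -v "$var" '%s' "$value"   # sets e.g. people=7.2 without eval
done < varsValues.txt

echo "${vars[@]}"    # aa something countries world languages people oceans
echo "$people"       # 7.2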
John B
  • +1 This would be preferred even if `readarray` *were* available, as it avoids external processes and a second pass through the file. – chepner May 24 '14 at 14:50
  • In my implementation the code works without values+=($value) to create an array, where array items can be listed with ${vars[@]}. – amc Oct 15 '19 at 20:02
  • This solution works perfectly, but pay attention to (the absence of) whitespace in `vars+=($var)`. Using `vars += ($var)` does *not* work, which surprised me a bit. – blubb Mar 11 '20 at 11:26
7

Here's the awk version. Note that Npars is not hardcoded.

vars=($(awk -F= '{print $1}' varsValues.txt))
values=($(awk -F= '{print $2}' varsValues.txt))

Npars=${#vars[@]}

for ((i=0; i<$Npars; i++)); do
    eval ${vars[$i]}=${values[$i]}
done

echo $people
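
If you would rather avoid eval, one option (a sketch, assuming the names in the file are valid shell identifiers) is to let declare do the assignment:

for ((i=0; i<Npars; i++)); do
    declare "${vars[i]}=${values[i]}"   # note: inside a function, declare would create locals
done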
S. Ahn
6

You can use extra parentheses to turn each command substitution into an array assignment:

vars=( $(cut -d '=' -f1 varsValues.txt) )
values=( $(cut -d '=' -f2 varsValues.txt) )

You can prefix the assignments with declare -a, but as commenters have pointed out, it's superfluous:

declare -a vars=( $(cut -d '=' -f1 varsValues.txt) )
declare -a values=( $(cut -d '=' -f2 varsValues.txt) )
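
A quick sanity check after the assignments (note that the unquoted command substitutions rely on word splitting, which is fine here because none of the fields contain whitespace):

echo "${#vars[@]}"    # 7
echo "${vars[4]}"     # languages
echo "${values[4]}"   # 2014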
malhal
anubhava
4

Try:

IFS=$'\n' vars=($(cut -d '=' -f1 varsValues.txt))
IFS=$'\n' values=($(cut -d '=' -f2 varsValues.txt))
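
One caveat: because these lines consist only of assignments, the IFS change persists in the current shell afterwards. A sketch that avoids that by saving and restoring IFS:

oldIFS=$IFS
IFS=$'\n'
vars=($(cut -d '=' -f1 varsValues.txt))
values=($(cut -d '=' -f2 varsValues.txt))
IFS=$oldIFS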
Noel Yap
4

Mac ships an outdated version of bash by default (due to licensing reasons), which lacks the readarray command.

This solution worked best for me (Mac user):

Check version of bash (probably version 3 from 2007)

bash --version

Download latest version of bash

brew install bash

Open a new terminal (which will load the new environment), then check the new version of bash (should be version 5 or higher)

bash --version

Check location(s) of bash

which -a bash

Output:

/usr/local/bin/bash
/bin/bash

You can see that you now have two versions of bash. Usually, both of these paths are in your PATH variable.

Check PATH variable

echo $PATH

The /usr/local/bin/bash should come before /bin/bash in this variable. The shell searches for executables in the order of occurrence in the PATH variable and takes the first one it finds.
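
If it is not, one common approach (a sketch; on Apple Silicon Macs Homebrew installs to /opt/homebrew/bin instead of /usr/local/bin) is to prepend it in your ~/.bash_profile:

# in ~/.bash_profile
export PATH="/usr/local/bin:$PATH"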

Make sure you are using a bash shell (rather than zsh) when trying the readarray command below.

Try out the readarray command, e.g. by feeding it the output of the ls command via process substitution to generate an array of the filenames in the current folder:

readarray var < <(ls); echo ${var[@]}

Also, if you want to write a bash script, make sure to use the correct shebang:

#!/usr/local/bin/bash
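
An alternative shebang that simply picks up whichever bash comes first in your PATH (a common pattern, not specific to this setup):

#!/usr/bin/env bash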
Amun_Re
2
perl -0777 -nE '@F= split /[=\r\n]/; say "@F[grep !($_%2), 0..$#F]"; say "@F[grep $_%2, 0..$#F]"' varsValues.txt

or, by reading the same file twice:

perl -F'=' -lane 'print $F[0]' varsValues.txt
perl -F'=' -lane 'print $F[1]' varsValues.txt
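
To get those lines into bash arrays, the output still has to be captured on the shell side, e.g. (a sketch, with the same word-splitting caveats as the cut-based answers):

vars=( $(perl -F'=' -lane 'print $F[0]' varsValues.txt) )
values=( $(perl -F'=' -lane 'print $F[1]' varsValues.txt) )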
mpapec
  • And then what? These don't populate shell variables as the sample script does. – msw May 24 '14 at 07:40
2

Let's start with this:

$ awk -F'=' '{values[$1]=$2} END{print values["people"]}' file
7.2

$ awk -F'=' '{values[$1]=$2} END{for (name in values) print name, values[name]}' file
languages 2014
oceans 3.4
world 1
something 20.6
countries 205
people 7.2
aa 13.7

Now - what else do you need to do?
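
If the only remaining step is to pull a single value back into the shell, a sketch along the same lines:

people=$(awk -F'=' '$1=="people"{print $2}' file)
echo "$people"    # 7.2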

Ed Morton
1

Figured I'd toss this in here: https://raw.githubusercontent.com/AdrianTP/new-environment-setup/master/utils/readarray.sh

#!/bin/bash
# from: https://peniwize.wordpress.com/2011/04/09/how-to-read-all-lines-of-a-file-into-a-bash-array/
readarray() {
  local __resultvar=$1
  declare -a __local_array
  local i=0
  # collect the file line by line (-r keeps backslashes literal)
  while IFS=$'\n' read -r line_data; do
      __local_array[i]=${line_data}
      ((++i))
  done < "$2"
  if [[ "$__resultvar" ]]; then
    # assign the collected lines to the named variable as an array
    eval "$__resultvar=(\"\${__local_array[@]}\")"
  else
    # no variable name given: print the lines, one per line
    printf '%s\n' "${__local_array[@]}"
  fi
}

I keep this in a "utils" folder in my "new-environment-setup" GitHub repo, and I just clone it down and import it whenever I need to read a file into an array of lines on a new computer or after wiping my drive. It acts as a backfill for the missing readarray builtin on Mac.

Import looks like:

# From: https://stackoverflow.com/a/12694189/771948
DIR="${BASH_SOURCE%/*}"
if [[ ! -d "$DIR" ]]; then DIR="$PWD"; fi
. "$DIR/utils/readarray.sh"

Usage looks like readarray "<output_var_name>" "<input_file_name>".
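
For the file from the question, that might look like this (assuming the array-assigning version of the function above):

readarray "lines" "varsValues.txt"
echo "${lines[0]}"     # aa=13.7
echo "${#lines[@]}"    # 7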

Yes it's a little rough. Sorry about that. It may not even work correctly anymore, but it did at one point, so I thought I would share it here to plant the idea of simply...writing your own backfill.

Adrian