This typical Bash scripting idiom for assigning a multi-line string literal to a variable:
read -r -d '' FILE_CONTENTS <<'EOF'
... contents ...
EOF
has been serving me just fine everywhere I used it.
I ran a script containing this idiom as root today, and all hell broke loose. Files became corrupt, all sorts of weird stuff fell from the sky; I pretty much vaporised my Mac. This mistake has destroyed... everything, basically. If I could upvote myself for being nuclear, I would.
Let me provide you with a minimal working example to reproduce it:
# Recreating a .vimrc file, for example
read -r -d '' FILE_CONTENTS <<'EOF'
let mapleader = "<space>"
let maplocalleader = "\\"
EOF
echo "${FILE_CONTENTS}" > .vimrc
This works just fine as user tinosino, but as root anything with a backslash is problematic.
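To see it in isolation, the kind of comparison I mean is the builtin echo against printf, since printf '%s\n' prints its argument verbatim and so makes a fixed reference point (this snippet is only an illustration, not part of the script):

# A string containing the backslashes from the .vimrc snippet above
s='let maplocalleader = "\\"'
echo "$s"            # as tinosino the backslashes survive; as root I suspect this is where they get mangled
printf '%s\n' "$s"   # printf '%s' does not interpret backslashes in its arguments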
Why? Where should I look for "the difference"? Options? Environment? Aliases on "echo"?! I am lost. I tried bash -x too: pages and pages of failures on the script I am testing. What could have changed, in the setup for root, and where?
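For reference, what I mean by comparing options, environment and aliases is roughly this, run once per account and then diffed (the file names are just placeholders):

# Dump shell options, set -o flags, aliases and environment for one account
{ shopt -p; set -o; alias; env | sort; } > "/tmp/shellstate.$(id -un)"
# afterwards: diff /tmp/shellstate.tinosino /tmp/shellstate.root
# shopt -p also lists xpg_echo, which controls whether the builtin echo expands backslash escapes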
OS X version: 10.8.2
Bash version: 3.2.48(1)-release
I have an sh vs bash guard in the script, but that did not work... was it badly constructed?
if [ ${BASH_VERSION-not_running_within_bash} = not_running_within_bash ]; then
    echo This script is meant to be executed by the Bash shell but it isn\'t.
    echo Continuing from another shell may lead to unpredictable results.
    echo Execution will be aborted... now.
    return
fi
It stops zsh but doesn't stop sh from running the script... how could I do that?
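I am starting to suspect that sh here may simply be bash running under another name, which would explain why BASH_VERSION is still set and the guard passes; a quick check along these lines should confirm or refute that (I have not dug further yet):

sh -c 'echo "${BASH_VERSION:-this sh is not bash}"'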
EDIT: I resorted to changing the script guard to this:
if [[ ! $(ps -p $$ -o comm | paste -s - | awk '{ print $NF }') == *bash* ]]; then
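The idea is that ps -o comm reports the command name of process $$, i.e. the shell actually running the script. A quick way to see what each interpreter would report (only a sanity check, not part of the script):

bash -c 'ps -p $$ -o comm'   # expected to end in bash
sh -c 'ps -p $$ -o comm'     # on this box reports sh rather than bash
zsh -c 'ps -p $$ -o comm'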