
I have a .csv that follows the pattern "col1","col2","col3" and the first row has headers. I'm trying to use sed to add some additional columns like "col4", where col4 comes from a variable.

For example, I'd like to add a column with the header "Athlete" where the rest of the rows are populated by the $NAME variable.

sed -i '1s/$/',\"Athlete\"'/; 2,$s/$/',\"$NAME\"'/' workouts.csv 

Right now I end up with

"col","col2","col3","Athlete"
"data","data","data","$NAME"

I'd hope for something like

"col","col2","col3","Athlete"
"data","data","data","Michael"

I've tried surrounding everything in double quotes and tried doubling the double quotes around $NAME.

I've also tried changing the s/regex/replacement/ delimiter to # to make it look cleaner.

I've read through man sed a few times now.

I've also pulled out a few hairs.

  • [Your code works](https://ideone.com/53b8NE). `sed -i '1s/$/,"Athlete"/; 2,$s/$/,"'$NAME'"/' workouts.csv` should work, too – Wiktor Stribiżew Sep 11 '19 at 20:19
  • @WiktorStribiżew very interesting. when I would run either of your solutions, I get the sed: unterminated `s' command error. I wish I had the background to understand what's different from my environment and what you linked. I'm going to take a stab at it, but Barmar's solution below worked well for me. Thank you, though! – Michael McNeil Sep 11 '19 at 20:53

1 Answer


The variable needs to be inside double quotes, not single quotes.

Use -e to provide multiple commands, so you can quote them differently. The first command doesn't contain any variables, so it can go in single quotes, where the double quotes don't need escaping. The second command contains the variable, so it has to go in double quotes, with the literal double quotes and the sed $ anchors escaped so the shell doesn't interpret them.

sed -i -e '1s/$/,"Athlete"/' -e "2,\$s/\$/,\"$NAME\"/" workouts.csv 
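
For example, with a two-row workouts.csv like the one in the question and NAME=Michael, a quick check might look like the sketch below (this assumes GNU sed; on BSD/macOS, -i needs an empty-string argument, i.e. sed -i ''):

NAME=Michael
# sample input matching the question's layout
printf '"col1","col2","col3"\n"data","data","data"\n' > workouts.csv
# append "Athlete" to the header line, and the expanded $NAME to every other line
sed -i -e '1s/$/,"Athlete"/' -e "2,\$s/\$/,\"$NAME\"/" workouts.csv
cat workouts.csv
# "col1","col2","col3","Athlete"
# "data","data","data","Michael"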
Barmar
  • Thank you! When I had tried surrounding the entire thing in double quotes before, I was failing to escape the '$'. And quoting them differently definitely helps me see what I was missing. Again....thank you so much. – Michael McNeil Sep 11 '19 at 20:46