
I'm running a snippet on the command line that executes some simple code inside a doubly nested loop, along with some printf-ing to track my progress:

for i in {1..180}; do for j in {1..200}; do
  printf "$i-$j\r";
  if [[ ! -f $dir/$i/$j/file0 ]] || [[ ! -f $dir/$i/$j/file1 ]]; then
    echo $j >> $i.missing;
  fi; done; done

To my great surprise, I see the inner loop index $j is reaching well over 200 - I've seen it go as high as 960. This may explain why this code is running so painfully slowly. I'm not really sure what mistake I've made here - does nesting loops in bash not work the way I think it does?

  • I ran 2 for loops that just printed text and it worked fine for me; do you have Control-M characters in your script? Could you please check by running `cat -v Input_file`? – RavinderSingh13 Oct 13 '19 at 16:23

1 Answer


The problem here has nothing to do with the loops; it lies with the \r character in the printf. \r just moves the cursor back to the beginning of the line without clearing it, so once you print 1-200, the shorter strings that follow don't overwrite the whole line: printing 2-96 leaves the trailing 0 from 200 in place, and the line reads 2-960. In other words, there's nothing wrong with the code, only with the way progress is being reported.
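You can see the effect in isolation with two printf calls (a minimal demonstration, run directly in a terminal):

printf "1-200\r"
printf "2-96"
echo    # the line now reads 2-960: the trailing 0 was never overwritten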

There are a number of easy fixes:

  1. Replace \r with \n, though this would no longer produce the desired behavior of updating a single line of output
  2. Clear each line before printing: printf "\r[many spaces]\r$i-$j"
  3. Print the numbers with a fixed width (left-padding with zeros): printf "\r%03d-%03d" $i $j (see the sketch below)
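Dropping option 3 into the loop from the question might look like the following sketch (same $dir, file names, and .missing output as in the question; the final printf just moves past the progress line once the loops finish):

for i in {1..180}; do
  for j in {1..200}; do
    # Fixed-width fields: "2-96" prints as "002-096", which fully
    # overwrites the previous "001-200" after the \r
    printf "\r%03d-%03d" "$i" "$j"
    if [[ ! -f $dir/$i/$j/file0 ]] || [[ ! -f $dir/$i/$j/file1 ]]; then
      echo "$j" >> "$i.missing"
    fi
  done
done
printf "\n"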