
I have a script that prints my volume status. It watches the output of pactl subscribe to determine when something has changed. Currently I'm doing this with a while loop, and after the script has been running for a certain amount of time (I can reproduce it quickly by holding a key that toggles mute for about a minute), the only output is "/usr/bin/grep: Argument list too long".

I've tried using < <(pactl subscribe), piping into the while loop, and reading from a FIFO. None of these work. Is this expected? If so, what is the right way to handle a command like pactl subscribe that produces endless output? Since the first error mentioned ponymix, I thought the problem might be there, but switching to pamixer doesn't fix anything either.

The full script is here. Here is a relevant excerpt:

while read -r event; do
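    # skip events about clients; anything else (sink/source changes, mute toggles) triggers a refresh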
    if echo "$event" | grep --quiet --invert-match --ignore-case "client"; then
        print_volume
    fi
done < <(pactl subscribe)

I expect no errors. The first error is line 36: /usr/bin/ponymix: Argument list too long. The second error is line 36: /usr/bin/grep: Argument list too long. After that, every line of output is line 88: /usr/bin/grep: Argument list too long.

Edit: This is not the same issue as the suggested duplicate, which is caused by deliberately passing a long argument list to a command. I am not using globbing as in that example.

noctuid
  • This might help: [How to debug a bash script?](http://unix.stackexchange.com/q/155551/74329) – Cyrus Jul 18 '19 at 02:05
  • Also `subscribe - ... pactl does not exit by itself, but keeps waiting for new events.` – David C. Rankin Jul 18 '19 at 02:06
  • The excerpt given can't possibly generate that error; it has no non-builtin commands with parameterized arguments. If something it calls is a function and *that function* calls external commands with parameterized arguments, it might be different – Charles Duffy Jul 18 '19 at 04:04
  • ...well, let me back up: The excerpt can't generate the error *unless you export more content to the environment than there's room for it to hold*. Command-line arguments and environment variables share the same space, so `export` too much and you can no longer run new programs with even *short* argument lists. – Charles Duffy Jul 18 '19 at 04:05
  • ...generally speaking, when you set variables, you should keep them as regular shell variables, not make them environment variables, unless you know exactly why you're doing otherwise. – Charles Duffy Jul 18 '19 at 04:06
  • Frankly, the code itself looks fine; I'd worry about whether you exported too much into the environment somewhere *before* it was even started. Either way, before we can start to debug this, we need a [mcve] someone else can run to generate the problem themselves, *included in the question*. – Charles Duffy Jul 18 '19 at 04:07
  • BTW, if you want to check your current environment size, `wc -c </proc/self/environ` will report it in bytes. – Charles Duffy Jul 18 '19 at 04:09
  • Possible duplicate of [Argument list too long error for rm, cp, mv commands](https://stackoverflow.com/q/11289551/608639) – jww Jul 18 '19 at 05:52
  • @jww, I don't think this is a fit; in that case they're explicitly passing an argument list long enough to cause the problem; in this case, the argument lists aren't dynamic at all (which also means that splitting them up with `xargs` isn't a feasible solution). – Charles Duffy Jul 18 '19 at 11:19
  • @noctuid, ...since you're back around and answering questions (or at least, editing to address the duplicate we both agree is false), what's your result wrt. environment space usage in the context where the script fails? – Charles Duffy Jul 19 '19 at 00:27
  • @noctuid, ...btw, if `wc` itself fails, you can measure that with native bash, like so: `count=0; while IFS= read -r -d '' envvar; do count=$(( count + ${#envvar} + 1 )); done </proc/self/environ; echo "$count" >&2`. It's not perfect -- that measures what usage *was* when your script started, vs what it *is* right now -- but it's a start. – Charles Duffy Jul 19 '19 at 00:29
  • To get a more up-to-date result (but that requires more manual interpretation), you could also run `declare -p`, and search through the results only for the entries that start with `declare -x` (`declare -- foo="bar"` is not exported, and doesn't use environment space). – Charles Duffy Jul 19 '19 at 00:31
  • You were right; the environment size was too large. I was repeatedly sourcing a file with exports in the while loop. I had no idea this could be an issue. Thanks so much for the help! – noctuid Jul 19 '19 at 00:34
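
For reference, a minimal sketch that pulls those checks together (the /proc/self/environ path is Linux-specific, and getconf ARG_MAX reports the limit shared by arguments and environment):

# environment size as it was when this script started (entries are NUL-delimited)
env_bytes=$(wc -c </proc/self/environ)

# kernel limit that command-line arguments and environment variables share
arg_max=$(getconf ARG_MAX)

printf 'environment: %s of %s bytes available for args+env\n' "$env_bytes" "$arg_max" >&2

# show only exported variables; plain "declare --" entries don't use environment space
declare -p | grep '^declare -x'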

1 Answer


The issue is that inside the print_volume function I was repeatedly sourcing a file with exports in it. As Charles Duffy pointed out, this made the environment grow until it was too large for new programs to be exec'd at all, which is what "Argument list too long" means here.
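
A stripped-down sketch of the kind of pattern that does this (the file name and the self-appending export are illustrative; any sourced export that keeps getting longer has the same effect):

# ~/theme.sh (autogenerated) -- illustrative contents; note the export that appends to itself:
#   export PANEL_COLORS="$PANEL_COLORS:#1d2021"

print_volume() {
    source ~/theme.sh    # runs again on every pactl event, so the exported value keeps growing
    # ... the ponymix/grep calls in here eventually fail with
    # "Argument list too long" once the environment fills up
}

In this sketch every event makes PANEL_COLORS a little longer; since argv and the environment share one fixed-size buffer, external commands eventually can't be started at all, even with short argument lists.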

noctuid
  • That does explain it! If you don't need to read those variables from a subprocess you could just take out the `export` directive, and thus define them as regular shell variables instead of environment variables. – Charles Duffy Jul 19 '19 at 00:39
  • The file I'm sourcing is autogenerated and used in other scripts as well. I've just started sourcing it once at the top of the script, which fixes the issue. Thanks again for the help. – noctuid Jul 19 '19 at 00:42
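
For completeness, a sketch of that fix with the same illustrative file name: the source moves out of print_volume to the top of the script, and the loop itself is unchanged.

source ~/theme.sh    # once, at the top of the script

while read -r event; do
    if echo "$event" | grep --quiet --invert-match --ignore-case "client"; then
        print_volume    # no longer re-sources the file
    fi
done < <(pactl subscribe)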