Inside a bash script, I set an environment variable to contain a string of 1 million characters, like this:
export LG=XXXXXXX # ... 1 million X's
Immediately after this, I am able to echo it back without a problem:
echo $LG
However, any other unrelated commands that I attempt to run after this inside the script fail with the "Argument list too long" error. For example:
cat randomfile.txt
/bin/cat: Argument list too long
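For reference, here is a minimal, self-contained reproduction of what I am seeing. The string is built with head/tr purely for the sake of the example; the exact generation method doesn't matter:

```shell
# Build a 1,000,000-character string of X's (head/tr is just one way to do it)
LG=$(head -c 1000000 /dev/zero | tr '\0' 'X')
echo "${#LG}"          # shell builtins still work: prints 1000000

export LG              # this is the step after which external commands break

# On Linux, a single environment string this large exceeds the kernel's
# per-string limit, so execve() fails with E2BIG for every external command:
/bin/true 2>/dev/null || echo "exec failed: Argument list too long"

unset LG               # removing the variable from the environment recovers
/bin/true && echo "exec works again"
```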
I have read through other posts that suggest using xargs to resolve this kind of issue, but I have not been successful with it. The strange part is that any command other than echo fails with "Argument list too long" even when the command does not reference $LG at all. I do eventually want to use $LG, but the error occurs whether or not I use it after it is set.
Any tips would be greatly appreciated, thanks!
Edit:
The overall problem I am trying to solve is something like this:
I have a text file that I need to keep as small as possible (a few MBs). It contains a set of messages encapsulated in a specific network protocol (header, length of message, the message itself). The message itself can be a string of 1 million characters or more. So, to keep the original file small, instead of storing multiple copies of the large message, I use a mapping: if I see the letter A in the message field, I use sed to find and replace A with 1 million X's, like this:
sed "s/A/$LG/g" file.txt # Replace A with 1 million X's
I will eventually be running this inside a (very slow) simulator, so I need this operation to complete in as few cycles as possible. In other words, something like an awk loop with a trip count of 1 million to dynamically generate the X's would be too slow. That is why I thought the environment variable approach would be best.
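For what it's worth, generating the X's themselves does not require a million-iteration loop at the shell or awk level. Two loop-free sketches (either of which could replace the awk approach; both stream the data in bulk):

```shell
# Option 1: take 1,000,000 NUL bytes from /dev/zero and translate them to X's
A=$(head -c 1000000 /dev/zero | tr '\0' 'X')

# Option 2: zero-pad a single digit to width 1,000,000, then translate 0 -> X
B=$(printf '%01000000d' 0 | tr '0' 'X')

echo "${#A} ${#B}"   # prints: 1000000 1000000
```

Note that neither assignment uses export, so the string stays a plain shell variable; my question about the exported variable breaking unrelated commands still stands.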