Update   It is clarified that the program that uses the file, parse_qr_results.ksh, is in fact a Perl script. The code in it that uses the file has single quotes around the filename, which it shouldn't have. Further, there are better ways to do what's shown in the question's edit -- to avoid a shell, or really to use Perl's File::Slurper or Path::Tiny::slurp if the need is indeed to "slurp" a file.† See for example this post (and pay attention to updates). For other aspects of this question as well, also see this post.
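For instance, a minimal sketch of such a slurp with File::Slurper (here $filename is just an assumed placeholder for the file's name):

use strict;
use warnings;
use File::Slurper qw(read_text);

# Read the whole file into one scalar (decoded as UTF-8 by default)
my $file_content = read_text($filename);

# Or, with Path::Tiny:
#   use Path::Tiny;
#   my $file_content = path($filename)->slurp_utf8;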
But I'm leaving the answer as it is -- for an assumption that .ksh is a (Korn) shell file.
That glob will scoop up the filenames fine (unless some have spaces in them, see File::Glob below), but the & will mess up the shell later. Escape/quote shell special characters, and String::ShellQuote is meant for that:
use String::ShellQuote qw(shell_quote);

for my $file (@filter2) {
    # shell_quote returns a string already quoted for the shell,
    # so don't wrap it in another layer of quotes inside qx()
    my $file_esc = shell_quote $file;
    my $qtr1 = qx(./parse_qr_results.ksh $file_esc);
    ...
}
Note that String::ShellQuote is for bash while the program in the question that uses the filename appears to be a Korn shell script (parse_qr_results.ksh). However, the shells' special characters are mostly the same and this may be good enough. A more generic tool for escaping special characters is quotemeta.
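A minimal sketch of that alternative, reusing the (assumed) loop and script name from above:

# quotemeta backslash-escapes every non-"word" character (&, spaces, etc.),
# which the shell then takes literally
my $file_esc = quotemeta $file;
my $qtr1 = qx(./parse_qr_results.ksh $file_esc);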
The quotes in the question protect that filename at first but then it's taken by the shell script and we don't know what goes on there, so I'd specifically quote/escape with a library and hope for the best. (I'd really rather first rename files with such names...)
Also, with such funky filenames it may be a good idea to switch to File::Glob
use File::Glob qw(:bsd_glob);
Then glob gets replaced by bsd_glob and there is no need to change the code. But there is more you can do with it, see the docs.
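A minimal sketch of that switch (the glob pattern below is only an assumed example):

use strict;
use warnings;
use File::Glob qw(:bsd_glob);

# With :bsd_glob imported, glob() no longer splits the pattern on whitespace,
# so paths with spaces in them work as expected
my @filter2 = glob "/some/dir with spaces/*.log";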
I strongly recommend use strict; and use warnings; -- for code quality, time savings, sanity, and general health :)
In the end, a better way altogether is to use a library for running external commands, which brings a lot of improvements to the whole process. For one, they can easily bypass the shell and return all output as wanted. The error diagnostics are usually better, too.
Some such libraries are IPC::Run (use run with a list, so as to bypass the shell), IPC::System::Simple (see capturex), and Capture::Tiny (use system in list form).
Since your command itself is a shell script I'd still protect the filenames, but avoiding the first shell (the one qx starts) in this way is helpful all the same.
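For example, a minimal sketch with IPC::System::Simple's capturex, reusing the assumed @filter2 loop and script name from above:

use strict;
use warnings;
use IPC::System::Simple qw(capturex);

for my $file (@filter2) {
    # capturex takes a list and never invokes a shell for this step,
    # so $file needs no escaping here (the .ksh script still sees the raw name)
    my $qtr1 = capturex('./parse_qr_results.ksh', $file);
}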
† And if there is somehow a problem with installing extra libraries, one simple way to read a file into a scalar is
my $file_content = do { local (@ARGV, $/) = $filename; <> };
if we have the name of the file to slurp. Or
my $file_content = do { local $/; <> };
for a filename given on the command line, which is then the first thing in @ARGV (normally, if there are no command-line options, or after all options have been processed and removed from @ARGV).
Or avoid the magic of <> entirely and use an explicit open my $fh, ... to open the file and then read it with <$fh>, after undefining the local-ized input record separator $/ as above.
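A sketch of that explicit-open version (with $filename again standing in for the actual name):

my $file_content = do {
    open my $fh, '<', $filename or die "Can't open $filename: $!";
    local $/;      # undefine the input record separator, so <$fh> reads the whole file
    <$fh>;
};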
Note the closing ; which is necessary.