I'm fairly new to Perl. I have a Perl script on a Linux machine that writes to its own logfile. The logfile name can change depending on the data the script is working on (date, filename, data type, etc.).
At some points the script calls a native executable via system(), which prints information to STDOUT and STDERR - a few dozen to a few hundred lines over many minutes. After the executable finishes, the script continues and logs some more information to the logfile.
Until now the script has only logged its own output, not the native executable's output, which I want written to the same file the Perl script logs to. I tried the following two methods:
#!/usr/bin/perl
# ... some other code ...

my @array_executable_and_parameters = qw/echo foo/;

# Method 1: alias STDOUT/STDERR to a lexical filehandle
open my $log_fh, '>>', 'log/logfile1.txt' or die "Cannot open logfile1: $!";
*STDOUT = $log_fh;
*STDERR = $log_fh;
print "log_fh=$log_fh\n";
system(@array_executable_and_parameters);

# Method 2: alias STDOUT/STDERR to a bareword filehandle
my $logfilename = 'log/logfile2.txt';
open(LOGFILEHANDLE, ">>$logfilename") or die "Cannot open $logfilename: $!";
*STDOUT = LOGFILEHANDLE;
*STDERR = LOGFILEHANDLE;
print LOGFILEHANDLE "Somethinglogged\n";
system(@array_executable_and_parameters);
This works when I run the script manually, but not when it runs from cron.
I know it is possible to redirect in the crontab by Linux means, but then I would have to know the log filename in advance, and it only becomes known once some data arrives, so that does not seem feasible. I would also like to keep as much as possible inside the script, without many dependencies on Linux specifics. I also have no possibility to install extra Perl modules or libraries; assume a bare-minimum install.
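Just to illustrate what I mean by redirecting in the crontab (the paths here are only placeholders), it would be something like:

# placeholder paths - the real logfile name is only known at runtime
*/5 * * * * /path/to/my_script.pl >> /path/to/fixed_logfile.txt 2>&1

The fixed logfile path is exactly the part I cannot provide in advance.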
How do I redirect STDOUT and STDERR to a specific file from inside the Perl script, so that system() output ends up there too?
And if possible, how do I detect which file STDOUT currently goes to?