I have a Perl script that submits a Hadoop (Sqoop) command to run in the background, inside a while loop that reads records from an input file. My scenario is: for each record, I have to fire the Hadoop command to Unix, make it run in the background, and DO NOT WAIT for the background process to finish before moving to the next record and firing the next one. The code below submits the Hadoop command in the background, but it waits for it to complete, which I don't want.
open(my $data, '<', $file) or die "Could not open '$file': $!\n";
while (my $line = <$data>) {
    chomp $line;
    my @fields = split /,/, $line;
    if ( $fields[7] eq 'Y' ) {   # string comparison needs eq, not ==
        # Trailing backslashes keep this one shell command across lines.
        `nohup sqoop export \\
            --connect \\
            "jdbc:sqlserver://sqlserver:1433;database=$fields[0];user=sa;password=pwd" \\
            --table $fields[2] \\
            --export-dir $src_dir \\
            --input-fields-terminated-by '$fields[3]' \\
            --input-lines-terminated-by '$fields[4]' \\
            --m $fields[5] \\
            --staging-table $fields[6] \\
            --clear-staging-table \\
            --batch > $tgt_dir/$fields[2].out &`;
        # The next line was meant to save the background PID, but it is shell
        # syntax, not Perl (in Perl, $! is errno, not the last background PID):
        # echo $! > $pid_file;
    }
    else {
        next;
    }
}
Please let me know how I can do it.
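One idea I had (not tested) is to fork a child per record and have the child exec sqoop, so the parent never waits. This is just a rough sketch, reusing $data, $src_dir and $tgt_dir from above and assuming sqoop is on the PATH:

my @pids;
while (my $line = <$data>) {
    chomp $line;
    my @fields = split /,/, $line;
    next unless $fields[7] eq 'Y';

    my $pid = fork();
    die "Could not fork: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: send its STDOUT to the per-table .out file, then replace
        # itself with sqoop; exec never returns on success.
        open STDOUT, '>', "$tgt_dir/$fields[2].out"
            or die "Could not redirect STDOUT: $!";
        exec 'sqoop', 'export',
            '--connect', "jdbc:sqlserver://sqlserver:1433;database=$fields[0];user=sa;password=pwd",
            '--table', $fields[2],
            '--export-dir', $src_dir,
            '--input-fields-terminated-by', $fields[3],
            '--input-lines-terminated-by', $fields[4],
            '-m', $fields[5],
            '--staging-table', $fields[6],
            '--clear-staging-table',
            '--batch'
            or die "exec sqoop failed: $!";
    }
    push @pids, $pid;   # parent does not wait; it loops to the next record
}

Is this the right direction?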
Also, each Hadoop command can run for more than a few minutes. After firing them all, I have to make the script wait for the background commands (by their PIDs) to finish, and then produce a report based on the STDOUT files the Hadoop commands wrote. Need help on how to accomplish this.
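For the waiting part, I assume waitpid over the collected @pids would work; again only a sketch, and the /ERROR|Exported/ filter is just a placeholder for whatever the report actually needs:

# Wait for every background sqoop job, recording its exit status.
my %status;
foreach my $pid (@pids) {
    waitpid($pid, 0);           # blocks until this particular child exits
    $status{$pid} = $? >> 8;    # the child's exit code
}
print "PID $_ exited with status $status{$_}\n" for sort keys %status;

# Then scan the .out files the jobs wrote for lines worth reporting.
foreach my $out (glob "$tgt_dir/*.out") {
    open my $fh, '<', $out or die "Could not open '$out': $!";
    while (my $line = <$fh>) {
        print "$out: $line" if $line =~ /ERROR|Exported/;   # placeholder filter
    }
    close $fh;
}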