I have a directory with hundreds of xlsx files. I want to convert all of these files to PDF, either all at once or in batches. The conversion itself works fine at the moment with a foreach loop and a cron job, but it converts files one at a time, which increases the waiting time for the user who is waiting for the PDF files.
I am thinking about parallel processing but don't know how to implement it. A rough sketch of what I'm considering is shown after my current code below.
Here is my current code:
    $common_path = '/var/www/html/conversions/'; // assumed: set elsewhere in my script, matching the glob path
    $files = glob($common_path.'xlxs_files/*');

    if (!empty($files)) {
        $now = time();
        $i = 1;
        foreach ($files as $file) {
            // convert at most 8 files per cron run
            if (is_file($file) && $i <= 8) {
                echo $i.'-----'.basename($file).'----'.date('m/d/Y H:i:s', @filemtime($file));
                echo '<br>';
                $path_parts = pathinfo(basename($file));
                $xlsx_file_name = basename($file);
                $pdf_file_name = $path_parts['filename'].'.pdf';
                echo '<br>';
                try {
                    echo $log = 'conversion started for '.basename($file).' on '.date('d-M-Y h:i:s');
                    echo '<br>';
                    $result = ConvertApi::convert('pdf', ['File' => $common_path.'xlxs_files/'.$xlsx_file_name], 'xlsx');
                    $result->getFile()->save($common_path.'pdf_files/'.$pdf_file_name);
                    echo $log = 'conversion finished for '.basename($file).' on '.date('d-M-Y h:i:s');
                    echo '<br>';
                    mail('amit.webethics@gmail.com', 'test', 'test');
                    unlink($common_path.'xlxs_files/'.$xlsx_file_name);
                } catch (Exception $e) {
                    $log_file_data = createAlogFile(); // my own logging helper
                    $log = 'There is an error with your file '.$xlsx_file_name.' -- '.$e->getMessage();
                    file_put_contents($log_file_data, $log."\n", FILE_APPEND);
                    continue;
                }
                $i++;
            }
        }
    } else {
        echo 'nothing to process';
    }
Any help will be highly appreciated. Thanks