(Just in case this is an XY problem): I need to export a really big table (85M rows), and I don't have SSH access. Neither PhpMyAdmin nor SQLyog (remote connection) works correctly: they export only about 3% of the table and then finish without reporting any error, and I'm not sure why.
So I thought of executing mysqldump from PHP. The problem is I'm not sure whether that will kill PHP too, because all the code I've seen waits for mysqldump to finish.
My code will be (not tested):
ini_set('memory_limit', '-1');
set_time_limit(0);
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);
$dbhost = 'localhost';
$dbuser = 'user';
$dbpass = 'pass';
$dbname = 'dbname';
$tableName = 'mytable';
$dumpFile = __DIR__."/mydump.sql.gz"; // note the slash, and .gz since the output is gzipped
$command = "mysqldump -h$dbhost -u$dbuser -p$dbpass $dbname $tableName | gzip > $dumpFile";
But I'm not sure whether it's better to use "exec", "shell_exec", or "system", and whether I need to wait for the output and then send the file for download, or whether I can launch the command, let the PHP script finish, and then come back after an hour or so and download the file via FTP. (I'm also worried about the file size; I guess it will be around 5 to 7 GB.)
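For the "launch and come back later" idea, this is roughly what I had in mind (untested, and the nohup/backgrounding part is my assumption about how to keep the dump running after the PHP request ends):

```shell
# Run mysqldump detached, so whatever launches it (e.g. PHP's exec())
# returns immediately instead of waiting for the pipeline to finish.
# nohup keeps it alive after the parent exits; the trailing & backgrounds it;
# redirecting stdout/stderr is needed so the shell doesn't block on open pipes.
nohup sh -c "mysqldump -hlocalhost -uuser -ppass dbname mytable | gzip > mydump.sql.gz" \
  > mysqldump.log 2>&1 &
```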
Would it be better to create 85 CSV files with 1 million rows each?
What is the best way to do this?
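If chunking is the way to go, I imagine something like mysqldump's --where option could slice the table by primary-key range (the "id" column is a hypothetical example, and this produces SQL chunks rather than CSV):

```shell
# Sketch: export 85 slices of ~1M rows each by primary-key range,
# assuming a numeric auto-increment "id" column. Key ranges avoid
# LIMIT/OFFSET, which gets slower as the offset grows on a huge table.
for i in $(seq 0 84); do
  start=$((i * 1000000 + 1))
  end=$(((i + 1) * 1000000))
  mysqldump -hlocalhost -uuser -ppass \
    --where="id BETWEEN $start AND $end" \
    dbname mytable | gzip > "chunk_$i.sql.gz"
done
```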