
(Just in case this is an XY problem): I need to export a really big table (85M rows) and I don't have SSH access. Neither phpMyAdmin nor SQLyog (remote connection) work correctly: they export only about 3% of my table and then finish without reporting any error, and I'm not sure why.

So I thought of executing mysqldump from PHP. The problem is that I'm not sure whether that will kill PHP too, because all the code I've seen waits for mysqldump to finish.

My code would be (not tested):

ini_set('memory_limit', '-1');
set_time_limit(0);
ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);

$dbhost = 'localhost';
$dbuser = 'user';
$dbpass = 'pass';
$dbname = 'dbname';
$tableName = 'mytable';
$dumpFile = __DIR__."/mydump.sql.gz"; // piped through gzip below, so name it .sql.gz

$command = "mysqldump -h$dbhost -u$dbuser -p$dbpass $dbname $tableName | gzip > $dumpFile";

But I'm not sure whether it's better to use exec, shell_exec, or system, and whether I need to wait for the output and then send the file for download, or whether I can execute the command, let the PHP script finish, and come back after an hour or so to download the file via FTP; a sketch of that detached variant is below. (I'm also worried about the file size; I guess it will be around 5 to 7 GB.)
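This is roughly the detached variant I have in mind (untested; it assumes exec is allowed on the host, and wraps the pipeline in sh -c so the whole thing can be backgrounded):

// Untested: redirecting all output and appending & should let mysqldump
// keep running after this PHP script returns.
exec('nohup sh -c '.escapeshellarg($command).' > /dev/null 2>&1 &');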
Or is it better to create 85 CSV files with 1 million rows each?

What is the best way to do this?

Enrique
  • I have been doing something similar in my project and it was working fine even when the browser killed the connection. I was importing an SQL file as a whole and it always completed the execution. – Dharman Jul 17 '19 at 13:08
  • Since you are stuck on PHP's execution time (you can set it via set_time_limit, https://www.php.net/manual/en/function.set-time-limit.php, though I don't know how much time you need), I would prefer a web socket: walk through the whole set of records and export it step by step. – Ferhat BAŞ Jul 17 '19 at 13:08
  • I haven't tried it before, but this could help you with the execution time: https://stackoverflow.com/questions/135835/limiting-the-number-of-records-from-mysqldump – if you use a limit with mysqldump you can get the whole data in parts. – Ferhat BAŞ Jul 17 '19 at 13:12
  • Do you actually need a dump, or can you query this table out in smaller batches that you could complete programmatically? – WillardSolutions Jul 17 '19 at 13:14

2 Answers


I am not an expert in servers, but as far as I know there is not much difference among exec, shell_exec, and system.

However, you need explicit permission to run these commands on a server, and whether you have it depends on the server configuration.
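A quick way to check is to look at PHP's disable_functions setting; this is my own sketch, not part of the original answer:

// Sketch: hosts usually block shell commands via disable_functions in php.ini.
$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
if (in_array('exec', $disabled, true)) {
    exit('exec() is disabled on this server');
}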

In my project I often have to move a huge DB of around 14 GB. I use exec, as that's the function allowed by the server. Here is the script I use to dump it:

// Replace $host/$user/$password/$database/$dumpFilePath with your own values.
$cmd = 'mysqldump --host='.escapeshellarg($host)
     .' --user='.escapeshellarg($user)
     .' --password='.escapeshellarg($password)
     .' '.escapeshellarg($database)
     .' > '.escapeshellarg($dumpFilePath);
exec($cmd, $output, $status);
if ($status) {
    // An error has occurred while exporting the database
    return;
}
// go on

I am not sure what you mean by:

Is it better to create 85 CSV files with 1 million rows each?

I simply do it with an .sql file and then import it. The size is not an issue when you import and export a DB using the terminal, or terminal commands run from PHP.
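For completeness, the import side looks like this (my own sketch, same placeholder variables as the dump above; mysql reads the dump from standard input):

// Import the dump back into the database.
$cmd = 'mysql --host='.escapeshellarg($host)
     .' --user='.escapeshellarg($user)
     .' --password='.escapeshellarg($password)
     .' '.escapeshellarg($database)
     .' < '.escapeshellarg($dumpFilePath);
exec($cmd, $output, $status);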

If you need to open the dump in some editor to check its content or modify it, then yes, your editor will certainly hang on such a huge file, but I am not sure why you would want to do that.

Update

As far as time is concerned, it totally depends on your internet connection, but I suggest you run your PHP script from the terminal rather than through a browser: the CLI has no execution time limit, whereas PHP behind a web server defaults to a max_execution_time of 30 seconds.
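If the script has to run through the browser anyway, a small guard like this (my own sketch, not from the original answer) lifts those defaults:

// The CLI SAPI already runs with no time limit; for the web SAPI,
// remove the 30-second default and survive a browser disconnect.
if (PHP_SAPI !== 'cli') {
    set_time_limit(0);
    ignore_user_abort(true);
}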

If it still doesn't work, you can try forcing the script to increase its memory and time limits explicitly using:

ini_set('memory_limit', '512M');
ini_set('max_execution_time', 0);
Abhay Maurya
  • I cannot run it from the terminal; I don't have SSH access. I don't want PHP to wait for mysqldump, just to execute it via PHP because I cannot do it via SSH. But I think I can run the command and, even if the PHP script is killed, mysqldump will continue working in the background. – Enrique Jul 17 '19 at 13:59
  • I didn't mean the server's terminal, I meant your own system's terminal. – Abhay Maurya Jul 18 '19 at 06:11
  • But the database is on my server. – Enrique Jul 18 '19 at 13:25
  • You don't have SSH access to your own server? – Abhay Maurya Jul 18 '19 at 13:31
  • Learner, that's a really common thing! It's very normal on any shared hosting, and even on some VPSes. – Enrique Jul 18 '19 at 18:23
  • I guess it depends on the plan. Well, you could set up a remote SQL host if you at least have cPanel on your server, and do it from your local terminal... best – Abhay Maurya Jul 19 '19 at 06:00

This code worked fine for me. It lets me pass the table name and the start ID for the dump in the URL, just in case it's helpful to someone else:

<?php

ini_set('display_errors', 1);
ini_set('display_startup_errors', 1);
error_reporting(E_ALL);

// The table name ends up in a shell command, so allow only word
// characters; cast the numeric parameters to int.
$tableName = $_GET['tablename'];
if (!preg_match('/^\w+$/', $tableName)) {
    exit('invalid table name');
}
$limit = isset($_GET['limit']) ? (int) $_GET['limit'] : 0;
$start = isset($_GET['start']) ? (int) $_GET['start'] : 0;

$dbhost = 'localhost';
$dbuser = 'root';
$dbpass = 'pass';
$dbname = 'mydb';
// gzip produces a .gz file, not a .zip
$dumpFile = __DIR__."/{$tableName}_{$start}_dump.sql.gz";

echo $dumpFile;
echo '<br>';

// Optional --where clause: "id > N" resumes after a given ID, and an
// optional LIMIT caps the chunk size (mysqldump appends this verbatim
// to the WHERE clause of its SELECT).
$startSql = $start ? 'id > '.$start : '1';
$limitSql = $limit ? ' LIMIT '.$limit : '';
$whereSql = ($limit || $start) ? '--where='.escapeshellarg($startSql.$limitSql) : '';

$command = "mysqldump -h$dbhost -u$dbuser -p$dbpass $whereSql --skip-add-drop-table --no-create-info $dbname $tableName | gzip > $dumpFile";

echo $command; // careful: this also prints the DB password
echo '<br>';

exec($command, $output, $status);
if ($status) {
    exit('error');
} else {
    exit('success');
}
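For example (hypothetical script name and values, assuming roughly contiguous IDs), the table can be dumped in 1-million-row chunks with calls like:

dump.php?tablename=mytable&start=0&limit=1000000
dump.php?tablename=mytable&start=1000000&limit=1000000
dump.php?tablename=mytable&start=2000000&limit=1000000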
Enrique