
My situation is the following. I obtain data from a certain source through PHP every 24 hours. During that window the data changes: new rows are added and existing ones are updated. When I run a query to insert or update this data every 24 hours, it takes a long time to execute. That would not be a problem if the execution time were moderate, since I plan to make this a cron job, but right now it is far too high. So I was thinking of writing this data from PHP to a CSV file as I obtain it, and then using that CSV file with the MySQL LOAD DATA function, which supposedly inserts a lot of data from a file very quickly.

So the questions are: is it possible to write a CSV file from PHP and have it formatted in a way that suits the LOAD DATA INFILE function? How can I, every 24 hours, delete that CSV and create a new one with the newly obtained data? How do I properly use LOAD DATA INFILE with this particular CSV file? And can I make a cron job out of all of this? Thanks.

Ahmad Azwar Anas
Emcho
  • Have you tried using batch inserts? Multiple inserts in one query are faster than one insert per query. See [http://stackoverflow.com/a/780046/1057527](http://stackoverflow.com/a/780046/1057527) – machineaddict May 30 '13 at 08:20
  • This is the correct procedure, actually you described the right way to complete this task. So short answer is: Yes. – Bud Damyanov May 30 '13 at 08:20
  • Batch inserts won't help me here. – Emcho May 30 '13 at 08:23
  • Do I need to make a cron job which makes a csv file out of this new data every 24 hours and then make a mysql procedure which uses LOAD DATA to insert this csv into the table, or can I somehow combine these two into one cron job. – Emcho May 30 '13 at 08:25
  • Why don't batch inserts help you? How many rows do you have to insert? I have tried batch inserts and they are at least 6 times faster (at least in my case) with over 100k records. They run in under a few seconds. Also, I'm using InnoDB with foreign keys, which is slower. – machineaddict May 30 '13 at 08:29
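For reference, the batch insert the commenters describe builds one multi-row INSERT statement instead of issuing a query per row. This is a sketch only; the connection details, table, and column names are placeholders:

    <?php
    // Placeholder credentials and schema -- adjust to your setup.
    $mysqli = new mysqli("localhost", "user", "pass", "db");

    $rows = [
        ['1', '2', '3', '4'],
        ['5', '6', '7', '8'],
    ];

    $values = [];
    foreach ($rows as $row) {
        // Escape each value before embedding it in the query string.
        $escaped = array_map([$mysqli, 'real_escape_string'], $row);
        $values[] = "('" . implode("','", $escaped) . "')";
    }

    // One query inserting all rows at once.
    $query = "INSERT INTO your_table (a, b, c, d) VALUES " . implode(",", $values);
    $mysqli->query($query);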

2 Answers


As you can read in the LOAD DATA documentation, the statement is very flexible: you can specify the field and line delimiters and many other options, which lets you produce your CSV in whatever format you want. LOAD DATA is indeed the fastest way to insert data into MySQL.
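For example, a minimal statement for a comma-separated file could look like this (the file path, table, and column names are placeholders):

    LOAD DATA INFILE '/path/to/file.csv'
    INTO TABLE your_table
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    (col1, col2, col3, col4);

The FIELDS and LINES clauses must match exactly how your PHP code writes the file, otherwise the rows will load garbled.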

In PHP you can very simply write data to a file, as shown here.
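In particular, PHP's built-in fputcsv() handles the delimiters and quoting for you, so the output is already in a shape LOAD DATA can consume (the path is a placeholder):

    <?php
    $rows = [
        ['1', '2', '3', '4'],
        ['5', '6', '7', '8'],
    ];

    $file = fopen('/path/to/file.csv', 'w');
    foreach ($rows as $row) {
        fputcsv($file, $row);  // writes one properly quoted CSV line per row
    }
    fclose($file);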

After that you will indeed need a cron job that loads the file into MySQL; this should be helpful.
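One way to wire it all together is a single PHP script that overwrites the old CSV with fresh data and then runs LOAD DATA, scheduled once a day. A sketch of the crontab entry (script path and log path are placeholders):

    # run every day at 03:00; the script regenerates the CSV and reloads it
    0 3 * * * /usr/bin/php /path/to/export_and_load.php >> /var/log/csv_load.log 2>&1

Because the script rewrites the CSV each run, there is no separate "delete" step: opening the file in write mode truncates it.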

Stephan

Assume you receive data from your source and prepare this array:

    $x = array(1 => array('1','2','3','4'), 2 => array('5','6','7','8'));

Create a CSV file like this:

    $file = fopen(<DIRECTORY_PATH>."file.csv", "w");
    if (!$file) {
        // error
    }

    $csv_data = "";

    foreach ($x as $row) {
        // implode joins the elements with commas, so there is no
        // trailing delimiter left to remove
        $csv_data .= implode(",", $row) . "\n";
    }

    fwrite($file, $csv_data);
    fclose($file);

    $query = "load data infile '".<DIRECTORY_PATH>."file.csv"."' into table your_table fields terminated by ','";
    if (!$mysqli->query($query))
        printf("Error: %s\n", $mysqli->error);
Bere