
I am building an automatic website-creation system. When I install a site for a customer, at the step where I execute the MySQL file I do something like this:

    # MySQL with PDO_MYSQL
    $db = new PDO("mysql:host=$mysqlHostName;dbname=$mysqlDatabaseName", $mysqlUserName, $mysqlPassword);
    $query = file_get_contents($local_sql); // local file abc.sql
    $stmt = $db->prepare($query);

    if ($stmt->execute()) {

My problem is that this file creates up to 117 tables and contains many, many INSERT statements.

Every time I build two or more websites, it freezes my server and brings it down.

My question is: is there any solution? Is there a way to execute a big SQL file like this faster than the way I am doing it?

Sorry if my English is bad.

Thank you all!
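One way to speed this up, assuming the PHP process has shell access and the `mysql` client is installed on the server, is to hand the dump file to the command-line client instead of loading it through PDO; the client streams the file and splits statements natively. A minimal sketch; the function name and all credential values below are placeholders, not part of the original question:

```php
<?php
// Build a mysql CLI import command. Running it streams the dump file directly
// into MySQL, which avoids reading the whole file into PHP and preparing it
// as one giant statement. All argument values are placeholder credentials.
function build_mysql_import_command(
    string $host, string $user, string $password,
    string $database, string $sqlFile
): string {
    return sprintf(
        'mysql --host=%s --user=%s --password=%s %s < %s',
        escapeshellarg($host),
        escapeshellarg($user),
        escapeshellarg($password),
        escapeshellarg($database),
        escapeshellarg($sqlFile)
    );
}

// Usage sketch (do not run blindly; check $exitCode for failure):
//   $cmd = build_mysql_import_command('localhost', 'web_user', 'secret', 'customer_db', '/tmp/abc.sql');
//   exec($cmd, $output, $exitCode);
```

`escapeshellarg()` is used on every value so that customer-supplied database names or passwords cannot inject extra shell commands.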

Your Common Sense
  • If the SQL files have the same performance effect when run elsewhere (e.g. on the command line), I don't think you can fix it from PHP. At most, you could parse out the individual statements (not really trivial if you want it to be robust) and add artificial delays, but it feels like a hack. – Álvaro González Nov 22 '16 at 09:27
  • @user3040610 this has nothing to do with indexing – e4c5 Nov 22 '16 at 09:29

1 Answer

-2

Try executing your queries one after another. Parse the SQL file and run the statements in a loop, like this:

$queries = explode(';', $query);
foreach ($queries as $sql) {
    $sql = trim($sql);
    if ($sql === '') {
        continue; // skip the empty fragment after the final ';'
    }
    $stmt = $db->prepare($sql);
    $stmt->execute();
}
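Note that `explode(';', …)` also splits on semicolons that appear inside string literals in the dump, e.g. `INSERT INTO t VALUES ('a;b')`, which Álvaro's comment above alludes to. A slightly more careful splitter, still only a sketch and not a full SQL parser (comments and backslash escapes inside strings are not handled), tracks the currently open quote character:

```php
<?php
// Split a SQL dump on top-level semicolons, ignoring semicolons that occur
// inside single- or double-quoted string literals. This is a sketch, not a
// full SQL parser: comments and backslash escapes are not handled.
function split_sql_statements(string $sql): array
{
    $statements = [];
    $current = '';
    $quote = null; // the currently open quote character, or null

    $len = strlen($sql);
    for ($i = 0; $i < $len; $i++) {
        $ch = $sql[$i];
        if ($quote !== null) {
            if ($ch === $quote) {
                $quote = null; // closing quote: back to top level
            }
            $current .= $ch;
        } elseif ($ch === "'" || $ch === '"') {
            $quote = $ch; // opening quote
            $current .= $ch;
        } elseif ($ch === ';') {
            $trimmed = trim($current);
            if ($trimmed !== '') {
                $statements[] = $trimmed;
            }
            $current = '';
        } else {
            $current .= $ch;
        }
    }
    $trimmed = trim($current); // a final statement may lack a trailing ';'
    if ($trimmed !== '') {
        $statements[] = $trimmed;
    }
    return $statements;
}
```

Wrapping the resulting statements in a single transaction (`$db->beginTransaction()` before the loop, `$db->commit()` after) can also cut the per-statement overhead noticeably on InnoDB tables.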
icoder