
In my ZF project, I'm importing data from a CSV file; after some processing, I insert the data into my MySQL database with a Zend_Db_Table. Here's what the code looks like:

private function addPerson($data)
{
    $personDao = new Person();
    $personRow = $personDao->createRow();

    // $newperson comes from a lookup (not shown) that returns -1
    // when this person is already in the database
    if ($newperson == -1)
    {
        //already in DB
    }
    else
    {
        $personRow->setName($data['name']);
        ...
        $personRow->save();
    }
}

It's working just fine. My only concern is the time it will take to insert thousands of rows this way. So my question is: is there any way I can improve my code for large files? Can I still use the save() function for a lot of rows (>6000)? Any suggestions are welcome.

I was wondering if there's a Zend function that can buffer, say, 500 rows and insert them in one shot instead of calling save() on each row. I'm already at 1 minute for 6,000 rows...
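Something like the rough, untested sketch below is what I have in mind: it buffers rows and flushes them as a single multi-row INSERT through the adapter. The person table, the name column, the 500-row batch size and the addPersonBuffered()/flushPersons() helper names are just placeholders, not existing Zend functions.

// Rough, untested sketch: buffer rows, then flush them in one multi-row INSERT.
// Table/column names ("person", "name") and the helper names are assumptions.
private $personBuffer = array();

private function addPersonBuffered($data)
{
    // the same "already in DB" check as before would go here (not shown)
    $this->personBuffer[] = $data;

    if (count($this->personBuffer) >= 500) {
        $this->flushPersons();
    }
}

private function flushPersons()
{
    if (empty($this->personBuffer)) {
        return;
    }

    $db = Zend_Db_Table::getDefaultAdapter();

    $placeholders = array();
    $bind         = array();
    foreach ($this->personBuffer as $row) {
        $placeholders[] = '(?)';        // one group of placeholders per row
        $bind[]         = $row['name']; // add the other columns the same way
    }

    // one round trip to MySQL instead of one INSERT per row
    $db->query(
        'INSERT INTO person (name) VALUES ' . implode(', ', $placeholders),
        $bind
    );

    $this->personBuffer = array();
}

Even without batching, wrapping the existing save() loop in $db->beginTransaction() / $db->commit() usually helps, because InnoDB otherwise commits after every single row.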

  • Your $newperson is not initialized anywhere – aderuwe Mar 25 '14 at 12:20
  • Yeah, it's the return value of a function I call that fetches the table to see if the id is in there. Sorry, I didn't put the whole code, but as said, it's working just fine. – user3415419 Mar 25 '14 at 12:24

1 Answer


To speed up the integration of the CSV file, I think you should hand the work over to MySQL itself, either with a stored procedure or through a PHP command-line script: generate a file from your processed data and let MySQL load it directly into your tables.
You will find ideas in Import CSV to mysql table.
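As a rough, untested illustration of that idea (it assumes the same person table with a name column, and that LOCAL INFILE is enabled on your MySQL connection, e.g. via PDO::MYSQL_ATTR_LOCAL_INFILE): write the cleaned rows to a temporary CSV and let MySQL load it in one statement.

// Sketch only: dump the already-processed rows to a temp CSV,
// then let MySQL bulk-load the whole file in one statement.
$tmpFile = tempnam(sys_get_temp_dir(), 'persons_');
$handle  = fopen($tmpFile, 'w');
foreach ($rows as $row) {                    // $rows = your cleaned CSV data
    fputcsv($handle, array($row['name']));
}
fclose($handle);

$db  = Zend_Db_Table::getDefaultAdapter();
$sql = "LOAD DATA LOCAL INFILE " . $db->quote($tmpFile)
     . " INTO TABLE person"
     . " FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'"
     . " LINES TERMINATED BY '\\n'"
     . " (name)";
$db->query($sql);
unlink($tmpFile);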

I have not done this myself, but I think it is quite feasible.

I hope it will help you :)

– doydoy44