
I have a temp table (a dump from an older system, with each record as one long row of many fields) with about 8k records. I need to break it down, apply some logic, and insert the data into 3 different tables.

table_temp (id, name, address, email, phone, mobile, etc......)
tbluser (id, username, password, roleid, status)
tblstudent (id, userid, name, address, classid, sectionid, etc.....)
tblstudentdetails(id, studentid, address, contact details....)

Since I need to get the data, apply some logic and create separate insert statements for all these tables, I have tried to create large multi-row insert statements like:

INSERT INTO `tbluser`(`instsessassocid`, `username`, `password`, `roleid`, `status`) VALUES 
('1','PRAGUN10000@temp.com','$2y10$gFscDWwWKR8Iven6R','3','1'),
('1','SONAM10001@temp.com','$2y10$gFscDWwWKR8Iven6R','3','1'), 
('1','SONAM10001@temp.com','$2y10$gFscDWwWKR8Iven6R','3','1')

And the same for other tables (you get the idea):

INSERT INTO `tblstudent`(`scholarnumber`,`firstname`, `middlename`,`lastname`) VALUES
('10000','PRAGUN','','TANWAR'),('10001','SONAM','','-'),
('10002','HITESH','','KUMAR'),('10003','KHUSHI','','MEHTA'),
('10004','ADITYA','','VASYANI')

After creating 4 large insert statements, I store them in an array and send it to my insert function, which turns off auto-commit, runs each insert statement and, once all is OK, commits and should return the insert ids, etc.

My app seems to time out (currently set to the default of 30 seconds). Surely such a small insert shouldn't time out, so can anyone point out what I am doing wrong or how I can improve it?

This is my insert function:

function dbInsert($sql) {
    // Accept either a single statement or an array of statements.
    if (!is_array($sql)) {
        $sqlarray = array($sql);
    } else {
        $sqlarray = $sql;
    }
    $sqlCount = count($sqlarray);
    $con = dbConnect();
    $insertID = array();
    try {
        // begin a transaction
        $con->autocommit(FALSE);

        foreach ($sqlarray as $value) {
            if ($con->query($value)) {
                $insertID[] = $con->insert_id;
            } else {
                trigger_error(mysqli_error($con));
            }
        }

        // if every statement succeeded, commit; otherwise roll back
        if (!mysqli_error($con) && $sqlCount === count($insertID)) {
            $con->commit();
        } else {
            $con->rollback();
            trigger_error("Error in dbInsert: " . mysqli_error($con));
        }
    } catch (Exception $e) {
        // if any error, catch the exception and roll back
        $con->rollback();
        trigger_error("Error in dbInsert: " . $e->getMessage());
    }
    /* close connection and return the result */
    $con->close();
    return $insertID;
}
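
For completeness, this is roughly how the calling side looks — a trimmed-down sketch; the statement strings here are short stand-ins for the full multi-row INSERTs shown above:

// Illustrative only: batch the generated statements and send them to dbInsert().
$statements = array(
    "INSERT INTO `tbluser` (`instsessassocid`, `username`, `password`, `roleid`, `status`)
     VALUES ('1', 'PRAGUN10000@temp.com', '\$2y10\$gFscDWwWKR8Iven6R', '3', '1')",
    "INSERT INTO `tblstudent` (`scholarnumber`, `firstname`, `middlename`, `lastname`)
     VALUES ('10000', 'PRAGUN', '', 'TANWAR')",
);

$newIds = dbInsert($statements);

// dbInsert() returns one insert_id per statement (the first generated id of each
// multi-row INSERT), so a matching count means everything was committed.
if (count($newIds) === count($statements)) {
    print_r($newIds);
}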

This is just a test; once it's sorted I will need to import quite a few more records, so I want to have "ready scripts" which I can use as needed.

I am aware of the LOAD DATA INFILE and mysqlimport approaches, but I need the intermediate logic, and I really think my scripts should be able to do the job. Mind you, I am on localhost, on a reasonably configured machine with nothing else going on.

Thank you!

Anx
  • Are you using MyISAM tables or InnoDB? InnoDB supports transactions but is rather slow on import. If you want to estimate the execution time, you could try importing a small number of lines and grab the timestamp before and after the operation. Once that's done, it's a small matter to calculate how long your script is going to take. – Osuwariboy Jul 01 '15 at 14:17
  • @Osuwariboy, I am using InnoDB. Yes, I can break it apart into smaller scripts. However, should about 3k records (which is what one set of inserts becomes) really time out? Is there anything grossly wrong with what I am doing, or is it just the way it is? Thanks heaps! – Anx Jul 01 '15 at 14:48

1 Answer


There doesn't seem to be anything inherently wrong with what you're doing. However, from what you told me, the data is already in a single table, with everything in one long row per record. So maybe you could have MySQL do the logic for you instead of PHP. Here's what I mean:

1) Add as many fields as you need to your first temporary table so you'll have enough to receive the information you want

2) Here's a solution that'll let you split a string with a given delimiter:

Split value from one field to two

What you do here is run a query like this:

UPDATE myTable
SET newField1 = SUBSTRING_INDEX(SUBSTRING_INDEX(membername, ' ', 1), ' ', -1),
    newField2 = SUBSTRING_INDEX(SUBSTRING_INDEX(membername, ' ', 2), ' ', -1),
    ....

In the end, what you want is your data split into all its various fields instead of just one big field.

3) At the end, you can just do:

INSERT INTO tbluser (column1, column2, column3)
SELECT field1, field2, field3 FROM myTable

Once the data is in its proper tables, you can just run a series of UPDATEs to apply whatever logic is necessary to make the data fit for production. I don't know if this is going to be faster than doing bulk inserts, but I think it's worth a try.
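
To make that concrete, here's a rough sketch using the table names from the question. The added columns, the column mapping, and the idea of joining back on the email/username are only guesses to illustrate the flow:

-- 1) Add the extra columns you need on the temp table.
ALTER TABLE table_temp
    ADD COLUMN firstname VARCHAR(100),
    ADD COLUMN lastname  VARCHAR(100);

-- 2) Split the combined name field with SUBSTRING_INDEX.
UPDATE table_temp
SET firstname = SUBSTRING_INDEX(SUBSTRING_INDEX(name, ' ', 1), ' ', -1),
    lastname  = SUBSTRING_INDEX(SUBSTRING_INDEX(name, ' ', 2), ' ', -1);

-- 3) Fill the target tables with INSERT ... SELECT, using a JOIN to
--    pick up the generated user ids for the student rows.
INSERT INTO tbluser (username, password, roleid, status)
SELECT email, '$2y10$gFscDWwWKR8Iven6R', '3', '1'
FROM table_temp;

INSERT INTO tblstudent (userid, firstname, lastname)
SELECT u.id, t.firstname, t.lastname
FROM table_temp t
JOIN tbluser u ON u.username = t.email;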

Osuwariboy
  • Thank you. Actually, I need to split the data, get the new ids from the different tables (via select statements) and insert accordingly into the 3 tables. I noticed that I can insert about 456 records before it times out. I did try writing the data out to a CSV file and doing a mysqlimport, but it would be nice to be able to do this via PHP+MySQL as I will need to import quite a bit of data going forward. I am surprised that it's not able to handle this, but I am at a loss as to how to improve on it. Interestingly though, MS Access seems to do a much faster job. Thank you for your help though! – Anx Jul 03 '15 at 05:38
  • You know, if the tables are all temporary, you could just create them as MyISAM and then do an ALTER TABLE to convert them to InnoDB. Also, InnoDB slows down considerably when the tables have indexed fields. Maybe you could try creating your tables without indexes and adding them once the data is loaded. – Osuwariboy Jul 06 '15 at 13:57
  • Thank you @Osuwariboy, I will try that. For now, however, I restricted the recordset to about 500 each time; it took longer but I was able to import all the data as needed. Thanks again! – Anx Jul 08 '15 at 04:56