OK, so let's presume I have 3 SQL tables and a text file in CSV format. File structure: imagine an array for each line. Value 0 is a timestamp; values 1, 4, 7, 10 are for table 1; values 2, 5, 8, 11 are for table 2; values 3, 6, 9, 12 are for table 3. What is the best approach for inserting the data into the DB after processing?
Solution 1: for each line read, execute "insert into tab ... ;". This is what I currently use, and it takes a lot of time.
Solution 2: for each line read, push the SQL statement into an array, and once all the lines are processed build one string like "Insert into tab...; insert into tab...; insert into tab...;" and execute that query.
The file has approx. 4000 lines and I have 7 values for each table.
I'm assuming that you're asking for code, correct? – jrd1 Aug 10 '13 at 07:13
No, I am not asking for code. I am asking for the best approach: either send the data to SQL as each line is read and processed, or build an array of SQL statements and, once all lines are processed, combine them into one string and send that query to MySQL. – BogdanD Aug 10 '13 at 07:20
2 Answers
You should use batch inserts instead of one INSERT statement per record. In your case, there should be one INSERT statement for each table (unless there are more records for a given table than can fit in one MySQL packet).
See: How to do a batch insert in MySQL
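As a rough sketch, a batch (multi-row) insert for one of the tables could look like this; the table and column names here are made up, since the question doesn't give them:

INSERT INTO tab1 (ts, v1, v2, v3, v4, v5, v6, v7)
VALUES
  ('2013-08-01 00:00:00', 1, 2, 3, 4, 5, 6, 7),
  ('2013-08-01 00:01:00', 8, 9, 10, 11, 12, 13, 14),
  ('2013-08-01 00:02:00', 15, 16, 17, 18, 19, 20, 21);
-- one such statement per table, with all ~4000 rows listed in the VALUES clause

This way the ~4000 rows for a table go to the server in a single round trip instead of 4000 separate statements.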
If you use batch inserts, you should ensure that each batch insert doesn't exceed the max_allowed_packet variable.
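You can check the current limit, and raise it if you have sufficient privileges, with something like:

SHOW VARIABLES LIKE 'max_allowed_packet';
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;  -- 64 MB; applies to connections opened after the change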
You may also find that indexes are your biggest bottleneck. Disabling them before and re-enabling them after a large insert may help (see the sketch below). See: How to disable index in innodb
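Note that ALTER TABLE ... DISABLE KEYS only affects non-unique indexes on MyISAM tables. For InnoDB, a common pattern is to relax the checks and wrap the whole load in a single transaction, roughly like this:

SET autocommit = 0;
SET unique_checks = 0;
SET foreign_key_checks = 0;
-- run the batch INSERT statements here
COMMIT;
SET foreign_key_checks = 1;
SET unique_checks = 1;
SET autocommit = 1;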
Lastly, MySQL can have this technique applied to it for high-bandwidth insertion.
I think you could use the MySQL syntax:
LOAD DATA INFILE '/path/to/file' INTO TABLE tbl_name
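Since the file feeds three tables, one option (a sketch only; table and column names are hypothetical) is to run LOAD DATA once per table and assign the columns belonging to the other tables to a throwaway user variable:

LOAD DATA INFILE '/path/to/file'
INTO TABLE tab1
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(ts, v1, @skip, @skip, v2, @skip, @skip, v3, @skip, @skip, v4);
-- @skip discards the columns that belong to the other two tables;
-- run a similar statement for tab2 and tab3, keeping their own columns instead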

Valentin Logvinskiy