10

I want to insert about 4K rows into a MySQL db, but I don't want to fire 4K separate 'insert' queries. Is there any way I can fire only one insert query to store those 4K rows in the db?

I searched on the internet, and everywhere I found users doing a bulk insert into the db from a file.

In my case, I have the data in memory and I don't want to first write it to a file just to do a bulk insert. Doing that would add delay to the program.

Amit

5 Answers

16

You could write a single insert query that does several inserts in one call to the database:

insert into your_table (field1, field2, field3)
values 
  (value1_1, value1_2, value1_3), 
  (value2_1, value2_2, value2_3), 
  (value3_1, value3_2, value3_3)


Here, with the example I've given, this single query would insert three rows into the table.
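
For example, if the data is held in memory, such a statement can be assembled programmatically. Here is a minimal JDBC sketch; the table and field names are just the placeholders from the query above, and the rows list is assumed to hold one Object[] of three values per row:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    class MultiRowInsert {
        // Builds one INSERT ... VALUES (...), (...), ... statement covering every
        // row held in memory and sends it in a single round trip.
        static void insertAll(Connection conn, List<Object[]> rows) throws SQLException {
            if (rows.isEmpty()) return;
            StringBuilder sql = new StringBuilder(
                    "insert into your_table (field1, field2, field3) values ");
            for (int i = 0; i < rows.size(); i++) {
                sql.append(i == 0 ? "(?, ?, ?)" : ", (?, ?, ?)");
            }
            try (PreparedStatement ps = conn.prepareStatement(sql.toString())) {
                int p = 1;
                for (Object[] row : rows) {          // each row: {field1, field2, field3}
                    for (Object value : row) {
                        ps.setObject(p++, value);
                    }
                }
                ps.executeUpdate();                  // one call inserts all rows
            }
        }
    }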

Pascal MARTIN
  • I think the db puts some constraints on the length of the query. It will work for 3 rows, but I don't think it will work for 4K rows. – Amit Apr 03 '11 at 12:35
  • This worked for me when I tried inserting 1000 rows (8 x int, 2 x string). But when I tried 2000 rows, I got a "MySQL server has gone away" error. To fix this, I will try to insert 1000 rows at a time like James C suggests. – Nick Bolton May 30 '12 at 22:54
  • @Pascal Is this atomic? Is it possible that if the db crashes halfway, only two out of three rows are inserted? – Pacerier Jun 14 '12 at 16:23
  • @YatendraGoel: You can still partition your set of single-row inserts into n inserts of (4K/n) rows each. – Sebastian Mach Aug 20 '13 at 12:06
8

MySQL's LOAD DATA command might be useful to you: http://dev.mysql.com/doc/refman/5.5/en/load-data.html

With reference to Pascal's suggestion: unless your command exceeds max_allowed_packet, you should be able to execute this query. In many cases it works best to create a few smaller inserts with, say, 1000 rows each.
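
A rough sketch of that chunking idea, reusing the insertAll helper from the earlier sketch; the chunk size of 1000 is only an assumption to tune against max_allowed_packet:

    import java.sql.Connection;
    import java.sql.SQLException;
    import java.util.List;

    class ChunkedInsert {
        // Splits the in-memory rows into chunks of at most 1000 and issues one
        // multi-row INSERT per chunk, so no single statement outgrows max_allowed_packet.
        static void insertInChunks(Connection conn, List<Object[]> rows) throws SQLException {
            final int chunkSize = 1000;              // assumption; tune to your packet size
            for (int from = 0; from < rows.size(); from += chunkSize) {
                int to = Math.min(from + chunkSize, rows.size());
                MultiRowInsert.insertAll(conn, rows.subList(from, to));
            }
        }
    }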

James C
6

You can execute your statements in a batch; a code example can be found here.

Also, call setAutoCommit(false) before adding the statements, and call conn.commit() after executeBatch() to minimise the number of commits.
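
A minimal sketch of that pattern, assuming an open JDBC Connection and the same placeholder table and columns as in the first answer:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    class BatchInsert {
        // Queues every row into one JDBC batch and commits once at the end.
        static void insertBatch(Connection conn, List<Object[]> rows) throws SQLException {
            conn.setAutoCommit(false);               // no commit after each statement
            String sql = "insert into your_table (field1, field2, field3) values (?, ?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (Object[] row : rows) {
                    ps.setObject(1, row[0]);
                    ps.setObject(2, row[1]);
                    ps.setObject(3, row[2]);
                    ps.addBatch();                   // queue, don't execute yet
                }
                ps.executeBatch();                   // send the whole batch at once
                conn.commit();                       // single commit for all rows
            }
        }
    }

With MySQL's Connector/J driver, adding rewriteBatchedStatements=true to the JDBC URL usually lets the driver rewrite such a batch into multi-row inserts on the wire.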

Mzq
2

This bulk-inserts more than 7,000,000 records in 1 minute into the database (a very fast query, with calculations done during the load):

    // Load the CSV straight into the table, computing the extra columns on the fly
    mysqli_query($cons, '
    LOAD DATA LOCAL INFILE "'.$file.'"
    INTO TABLE tablename
    FIELDS TERMINATED BY \',\'
    LINES TERMINATED BY \'\n\'
    IGNORE 1 LINES
    (isbn10,isbn13,price,discount,free_stock,report,report_date)
     SET RRP = IF(discount = 0.00,price-price * 45/100,IF(discount = 0.01,price,IF(discount != 0.00,price-price * discount/100,@RRP))),
         RRP_nl = RRP * 1.44 + 8,
         RRP_bl = RRP * 1.44 + 8,
         ID = NULL
    ') or die(mysqli_error($cons));
    $affected = (int) (mysqli_affected_rows($cons)) - 1; // minus 1 for the ignored header line
    $log->lwrite('Inventory.CSV to database: '.$affected.' records inserted successfully.');

RRP, RRP_nl and RRP_bl are not in the CSV; they are calculated during the load and then inserted.

krunal panchal
0

In MySQL you can use LOAD DATA INFILE:

LOAD DATA INFILE 'C:/MyTextFile'
INTO TABLE myDatabase.MyTable
FIELDS TERMINATED BY ','
ArsalanK