
I need to insert records into MySQL from a text file; that in itself is not a problem.

The problem is that the text file contains millions of records, which means that processing one text file would result in millions of INSERT queries. Also, I can't (don't want to) use LOAD DATA INFILE from PHP, because I have to perform some data extraction on the text file first. Storing the file in a BLOB field is not what I am looking for either.

  • What do you mean by 'I have to perform some data extraction'? With LOAD DATA LOCAL INFILE you can make various modifications to the column values before inserting the data; you can split a field value into several parts, etc. – Andrew Aug 28 '13 at 07:02
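If the extraction is simple enough, it could be done during the load itself, roughly along the lines the comment describes. Below is a minimal sketch, not a drop-in solution: the file path, the log_entries table, the ip and name columns, the tab delimiter, and the '|' split are all assumptions made up for illustration.

-- Load the file directly and transform a column on the way in (hypothetical names).
LOAD DATA LOCAL INFILE '/path/to/data.txt'
INTO TABLE log_entries
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(ip, @raw)
SET name = SUBSTRING_INDEX(@raw, '|', 1);  -- keep only the part before the first '|'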

1 Answer


Instead of individual INSERTs, use bulk (multi-row) INSERTs like:

INSERT INTO a VALUES (1,23),(2,34),(4,33);

Might be worth looking at this question.

Keep in mind that you might hit the packet size limit; look at this question for pointers.
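For reference, that limit can be checked and raised on the server; a minimal sketch follows (the 64 MB value is only an illustration, and SET GLOBAL needs the appropriate privilege and only affects new connections).

-- Check the current limit (in bytes); a multi-row INSERT must fit within it.
SHOW VARIABLES LIKE 'max_allowed_packet';
-- Raise it, e.g. to 64 MB (illustrative value only).
SET GLOBAL max_allowed_packet = 64 * 1024 * 1024;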

  • How long will it take if I increase the MySQL packet size? Do you have an idea for 1 million records? Maybe bulk inserts in chunks would help me, but I need an idea of how many records I should insert in one query. – Mukesh Swami May 08 '13 at 05:52
  • It depends on the content in the query, so figure out a rough estimate based on a single record and go from there. – Sabeen Malik May 08 '13 at 05:54
  • Thanks for your reply. I want to insert 3 fields (Ip & Name & ) from each row in the text file. Maybe I should chunk the insert query like this: a bulk INSERT of 10,000 records per query, and 10,000 queries like that. – Mukesh Swami May 08 '13 at 05:57
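A rough way to size the chunks, following the answer above: estimate how many bytes one row contributes to the SQL text and divide the packet limit by that. For example, assuming each value tuple renders to roughly 60 bytes, a 16 MB max_allowed_packet would hold on the order of 250,000 rows in a single statement, so batches of 10,000 leave a wide margin. One batch of that chunked approach might look like the sketch below; the table and column names are hypothetical, and 'extra' merely stands in for the unspecified third field.

-- One chunk: a single multi-row INSERT of up to the chosen batch size,
-- repeated until the whole file has been processed.
INSERT INTO records (ip, name, extra) VALUES
('10.0.0.1', 'alice', 'a'),
('10.0.0.2', 'bob', 'b');
-- ... and so on, up to 10,000 value tuples per statement in this example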