
I have a page where users can import their contacts. Initially it worked fine for up to 3000 contacts, but when I tried to import 10000 contacts it started taking too much time, and now even 100 contacts take too much time. In the MySQL my.cnf file I increased the max packet size to 256 MB. My php.ini max upload limit is 512 MB and the memory limit is 512 MB. I have tried several methods to solve this problem, but with no success.

my.cnf:

[mysqld]
set-variable = max_connections=500000
log-slow-queries
safe-show-database
local-infile=0
max_allowed_packet=256M

I also tried to increase the buffer and cache limits, but had no success there either.

Sujit Tiwari
  • What engine? How many rows are currently in the table? – Layke May 28 '11 at 01:22
  • Sir, it's MySQL. And why have you given a negative vote? It's one of the most critical situations: currently, if 0 rows are there and we import 3000 rows it works fine, but the instant we try to import 10000 rows the entire MySQL becomes permanently slow for that particular database. Even for 100 rows it will take more than 1 hour. I tried all types of alternatives, and as this is a forum of expert people I expected it to be solved, but it's not easy. Don't give a -ve vote if you can't solve the problem. – Sujit Tiwari May 28 '11 at 01:32
  • @nick: None of his questions have a positive number of votes. I haven't looked through them, but I wouldn't be surprised if there were no acceptable answers to them. – Lightness Races in Orbit May 28 '11 at 01:33
  • @Tomalak giving a negative vote is easy. If you can solve this issue I'll accept your expertise; otherwise don't give a lecture, give a solution. – Sujit Tiwari May 28 '11 at 01:35
  • @Sujit: There is no need for tone like that. I was talking to nick in your defence. – Lightness Races in Orbit May 28 '11 at 01:40
  • @Sujit, Tomalak was actually speaking on your behalf, not criticising you. As to Layke's question, What engine as in "InnoDB" or "MyISAM"... – Jonathan Fingland May 28 '11 at 01:41
  • Question is vague. What is "too much time"? What does performance analyser say? What is the structure of your database? What data are you inserting? What storage engine? – Lightness Races in Orbit May 28 '11 at 01:42
  • @Tomalak the engine is MyISAM – Sujit Tiwari May 28 '11 at 01:43
  • @Sujit: Answering just one of my queries in a comment (as opposed to providing _all_ the necessary information in the question) is not helpful. Voting to close for now. – Lightness Races in Orbit May 28 '11 at 01:44
  • I think sometimes server configuration can also make MySQL slow. My server is a Pentium dual core, but I am using 64-bit, and I read somewhere that this also has an effect. My concern is that if anyone imports more than approx. 3000 contacts, that MySQL database becomes permanently slow, and even for 100 contacts it starts taking a long time, whereas in general it takes a few seconds. – Sujit Tiwari May 28 '11 at 01:47
  • @Tomalak sir, in my question itself I have given all the details of my php.ini and my.cnf. I don't think any other details are required; if you need any details, please ask, sir. – Sujit Tiwari May 28 '11 at 01:49
  • The number of rows you are working with is absolutely nothing. MySQL with InnoDB will work for tables with millions of records with no issue. What sort of data are you storing? You must have a massive bottleneck somewhere. – Jords May 28 '11 at 01:50
  • @Sujit: I did ask. You failed to provide. – Lightness Races in Orbit May 28 '11 at 01:51
  • When you are trying to insert the 100 and finding it slow, can you check your slow query log (a quick sketch for checking it follows these comments)? See if it is really taking a long time for MySQL to insert each record. You could also try switching to InnoDB - it is faster for writes. However, for this sort of load anything should work. – Jords May 28 '11 at 01:52
  • @Tomalak here is the link where I am importing CSV files as well as users' contacts. See, for 300-400 users it takes several minutes against the usual 2-3 seconds. Just now I deleted all rows; even then it will take much time. – Sujit Tiwari May 28 '11 at 01:53
  • @Sujit: All of your questions so far have a gross of negative votes (except one, which has a score of 0). I recommend having a read through the Stack Overflow FAQ and also checking out some of the higher-rated questions, to see what is considered a "good" question here. – Lightness Races in Orbit May 28 '11 at 01:55
  • I like when a questioner starts getting belligerent and saying things like "I've given you all the information I think you will need to answer my question." – Hogan May 28 '11 at 03:08
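
Following up on Jords' slow-query-log suggestion above, here is a quick sketch for checking whether the slow query log is active and what threshold it uses. It assumes MySQL 5.1 or later, where these are dynamic variables; the log-slow-queries line in the my.cnf above suggests the log may already be enabled.

SHOW VARIABLES LIKE 'slow_query_log%';   -- is the log enabled, and where does it write?
SHOW VARIABLES LIKE 'long_query_time';   -- current slowness threshold, in seconds

SET GLOBAL slow_query_log = 'ON';        -- enable it at runtime (needs the SUPER privilege)
SET GLOBAL long_query_time = 1;          -- log anything that takes longer than 1 second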

2 Answers

21

Don't automatically assume that your server settings are wrong. The default settings are probably fine. Inserting 10000 rows should be a piece of cake, even on an old machine, but it depends on how you do your inserts.

Here I'll describe 3 methods for inserting data, ranging from slow to fast:

The following is extremely slow if you have many rows to insert:

INSERT INTO mytable (id,name) VALUES (1,'Wouter');
INSERT INTO mytable (id,name) VALUES (2,'Wouter');
INSERT INTO mytable (id,name) VALUES (3,'Wouter');

This is already a lot faster:

INSERT INTO mytable (id, name) VALUES
  (1, 'Wouter'),
  (2, 'Wouter'),
  (3, 'Wouter');

(Edited wrong syntax)

And this is usually the fastest:

Have a CSV file that looks like this:

1,Wouter
2,Wouter
3,Wouter

And then run something like

LOAD DATA INFILE 'c:/temp.csv' INTO TABLE mytable FIELDS TERMINATED BY ',';

Which of the above methods do you use?
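
A slightly fuller version of that statement, showing the clauses that typically matter for a CSV import (the file path is illustrative; note also that the local-infile=0 setting in the my.cnf above disables the client-side LOCAL variant, so the file has to be readable by the MySQL server itself):

LOAD DATA INFILE '/tmp/contacts.csv'     -- illustrative path; must be readable by the server
INTO TABLE mytable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(id, name);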

Wouter van Nifterick
  • +1: great suggestions with good examples. Do you know how many sets of values you can concatenate in the compound insert statement? – IAbstract May 28 '11 at 02:00
  • INSERT INTO mytable (id, name) VALUES ... this is the method I am using – Sujit Tiwari May 28 '11 at 02:06
  • Just to be sure: I'm not talking about the "values" syntax. Do you send multiple values at once, or do you send `INSERT INTO ....` for every row? – Wouter van Nifterick May 28 '11 at 02:13
  • I've modified the example to get rid of the confusion. Sujit, can you answer again with what you see now: do you use the first or second method? Anyway, if it's the first, then start using the second method. Try sending data in blocks of 1000 rows or so like this. That should give your imports a huge boost. – Wouter van Nifterick May 28 '11 at 02:15
  • @Wouter multiple values at once. I knew no one would be able to solve this question; before posting it here I did a lot of trials with almost every alternative, but I thought someone in this expert forum might help. But I got a negative vote. Very disappointed. – Sujit Tiwari May 28 '11 at 03:30
7

In addition to the suggestions @Wouter provided, you should check your indexes. If you are inserting 10k+ rows, you may want to disable keys prior to the insert process. Don't forget to enable keys afterward.

ref. MySQL 5.5 Documentation:

This feature can be activated explicitly for a MyISAM table. ALTER TABLE ... DISABLE KEYS tells MySQL to stop updating nonunique indexes. ALTER TABLE ... ENABLE KEYS then should be used to re-create missing indexes. MySQL does this with a special algorithm that is much faster than inserting keys one by one, so disabling keys before performing bulk insert operations should give a considerable speedup. Using ALTER TABLE ... DISABLE KEYS requires the INDEX privilege...
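
A minimal sketch of that pattern, reusing the mytable example from the other answer (DISABLE KEYS only affects non-unique indexes, and only on MyISAM, which is the engine the asker mentioned in the comments):

ALTER TABLE mytable DISABLE KEYS;      -- stop maintaining non-unique indexes during the load

INSERT INTO mytable (id, name) VALUES  -- bulk insert here (multi-row INSERTs or LOAD DATA INFILE)
  (1, 'Wouter'),
  (2, 'Wouter'),
  (3, 'Wouter');

ALTER TABLE mytable ENABLE KEYS;       -- rebuild the disabled indexes in one pass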

IAbstract
  • +1.. good suggestion. Even though I think Sujit is not listening anymore (he doesn't answer my question), and by looking at his other questions I don't think he'll come back to this page. But I see SO as some sort of wiki. One day somebody is going to arrive on this page via Google, and your answer will be helpful. – Wouter van Nifterick May 28 '11 at 02:26
  • I am finding this page useful ! I have a similar issue. I have to insert/compare about 500 rows of data per user possibly once every 5 days. Found this post very useful. – Prakash Raman Jul 02 '12 at 21:02