
Good day.

We have a requirement to insert around 800,000 records into a custom table, which currently takes 30+ minutes using simple insert statements. We read the records from a file and create a .sql file containing one plain insert statement per record. I believe there must be a more effective way to insert the records than this.

I have tried the following:

  1. The /*+ APPEND */ hint: it took even longer than before, because the direct-path insert forced a commit after every insert statement, so this is ruled out.
  2. A multi-table INSERT ALL:

         INSERT ALL
           INTO c_table (X, X, X) VALUES (' ', ' ', ' ')
           INTO c_table (X, X, X) VALUES (' ', ' ', ' ')
           INTO c_table (X, X, X) VALUES (' ', ' ', ' ')
         SELECT 1 FROM dual;

For a count of around 1,000 rows option 2 works well, but for the full bulk insert it again takes more than 30 minutes. Please share any other options you can think of.

Note: I cannot use SQL*Loader or Data Pump as we don't have the database password.

Regards, Prashanth

  • If you don't have the database password, what good does a SQL file do you? You need to log in to the database in order to run the .sql file. 800,000 individual SQL statements, none of which use bind variables, is going to be terribly slow; most of your time is likely to be spent parsing 800,000 individual SQL statements. SQL*Loader would be radically more efficient. A simple Java/C#/<> app that created a single statement with bind variables and iterated over the 800,000 rows would be much more efficient. – Justin Cave Aug 16 '12 at 00:17
  • @JustinCave: This is set up as a batch job process which inserts the records into the custom table. It will be executed by, say, user "X", for which the password will be encrypted. I believe that to use SQL*Loader we need to pass the username/password. Kindly suggest. – kanna Aug 16 '12 at 00:29
  • More on SQL*Loader [here](http://www.orafaq.com/wiki/SQL*Loader_FAQ) – Anjan Biswas Aug 16 '12 at 00:31
  • 800,000 rows in 30 minutes. That's about 26,700 rows per minute, or 444 a second. IMO that's not bad for interpreting SQL statements and running them. What is your time requirement for accomplishing this task? – Bob Jarvis - Слава Україні Aug 16 '12 at 01:07
  • Oracle often has trouble parsing large `INSERT ALL` statements. Try replacing it with a `UNION ALL` method, and breaking the single statement into several smaller statements. See my answer here for some more information: http://stackoverflow.com/a/11663076/409172 – Jon Heller Aug 24 '12 at 05:10
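A hedged sketch of the UNION ALL approach from the comment above, assuming a hypothetical three-column table c_table(col1, col2, col3); the generated .sql file would then contain one statement like this per batch of a few hundred rows instead of one insert per row:

    -- Hypothetical column names; one statement per batch of rows
    -- (e.g. 500 rows per statement) rather than one INSERT per row.
    INSERT INTO c_table (col1, col2, col3)
    SELECT 'a1', 'b1', 'c1' FROM dual UNION ALL
    SELECT 'a2', 'b2', 'c2' FROM dual UNION ALL
    SELECT 'a3', 'b3', 'c3' FROM dual;
    -- ...next batch as another INSERT ... SELECT ... UNION ALL statement,
    -- with a COMMIT every few statements.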

1 Answer


Don't create a .sql file from the input data. Instead, create an external table based on the source file and query it - you should be able to run the entire insert as a single statement with the APPEND hint, and it should take much less time than all the individual insert statements.
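A minimal sketch of what that might look like, assuming a hypothetical directory object data_dir pointing at the folder that holds the file, hypothetical column names, and a comma-delimited file; the real definition has to match the actual file layout:

    -- External table over the source file (names and layout are assumptions)
    CREATE TABLE c_table_ext (
      col1 VARCHAR2(100),
      col2 VARCHAR2(100),
      col3 VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
      )
      LOCATION ('input_file.csv')
    );

    -- One direct-path insert instead of 800,000 individual statements
    INSERT /*+ APPEND */ INTO c_table (col1, col2, col3)
    SELECT col1, col2, col3 FROM c_table_ext;
    COMMIT;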

– Jeffrey Kemp
  • I can't use external tables because the file will be available only on the application server, while the data directory would have to be created on the DB server. Kindly help me on this; I am totally stuck with this issue. – kanna Aug 21 '12 at 03:27
  • It is possible to share a folder from the app server to the db server, e.g. using a Samba share. Don't ask me how, though; that's how it's done where I am working at the moment. – Jeffrey Kemp Aug 21 '12 at 07:10
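If such a share is mounted on the DB server, the external table's directory object could simply point at the mount. A sketch under that assumption (the path and grantee are hypothetical, and CREATE DIRECTORY needs a suitably privileged account):

    -- Run by a privileged user; the mount path and user name are assumptions
    CREATE OR REPLACE DIRECTORY data_dir AS '/mnt/appserver_share';
    GRANT READ, WRITE ON DIRECTORY data_dir TO batch_user;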