
I have recently taken data dumps from an Oracle database. Many of them are large (~5 GB). I am trying to insert the dumped data into another Oracle database by executing the following SQL in SQL Developer:

@C:\path\to\table_dump1.sql;
@C:\path\to\table_dump2.sql;
@C:\path\to\table_dump3.sql;
             :

but it is taking a very long time: more than a day for even a single SQL file to complete.
Is there a faster way to get this done?

d-_-b

1 Answer


SQL*Loader is my favorite way to bulk-load large data volumes into Oracle. Use the direct-path load option for maximum speed, but understand the impacts of direct-path loads (for example, all data is inserted past the high-water mark, which is fine if you truncate the table first). It also tolerates bad rows, so if your data has "some" mistakes it can still work.

SQL*Loader can also defer index maintenance and rebuild the indexes at the end, which makes bulk inserting very fast.

Example of a SQL*Loader call (conventional path here, committing every 200,000 rows):

$SQLDIR/sqlldr /@MyDatabase direct=false silent=feedback \
    control=mydata.ctl log=/apps/logs/mydata.log bad=/apps/logs/mydata.bad \
    rows=200000
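
To actually use the direct-path load mentioned above, flip direct to true; skip_index_maintenance=true is the option that defers index maintenance until after the load. This is a sketch based on the call above, not tested against any particular environment:

$SQLDIR/sqlldr /@MyDatabase direct=true skip_index_maintenance=true \
    silent=feedback control=mydata.ctl log=/apps/logs/mydata.log \
    bad=/apps/logs/mydata.bad

Keep in mind that indexes skipped this way are left UNUSABLE and need to be rebuilt once the load finishes.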

And the mydata.ctl would look something like this:

LOAD DATA
INFILE '/apps/load_files/mytable.dat'
INTO TABLE my_schema.my_table
FIELDS TERMINATED BY "|"
 (ORDER_ID,
  ORDER_DATE,
  PART_NUMBER,
  QUANTITY)
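
If ORDER_DATE arrives as text rather than in your database's default date format, the control file can parse it with a format mask on that field. The mask below is a placeholder and must match the actual extract:

ORDER_DATE DATE "YYYY-MM-DD HH24:MI:SS",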

Alternatively... if you are just copying the entire contents of one table to another across databases, you can do this if your DBA sets up a database link (a 30-second process), presupposing your database has the redo space to accomplish it.
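
For reference, creating the link itself is a one-liner; the user, password, and TNS alias below are placeholders your DBA would fill in:

CREATE DATABASE LINK my_remote_db
  CONNECT TO remote_user IDENTIFIED BY remote_password
  USING 'REMOTE_TNS_ALIAS';

With the link in place, the copy itself is just: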

truncate table my_schema.my_table;

insert into my_schema.my_table
select * from my_schema.my_table@my_remote_db;

Using the /*+ append */ hint on the insert still gets you a direct-path insert.
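
To be explicit about placement, the hint goes immediately after the INSERT keyword; this is the same statement as above, just hinted:

insert /*+ append */ into my_schema.my_table
select * from my_schema.my_table@my_remote_db;

Note that after a direct-path insert, the table cannot be queried in the same transaction until you commit.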

Hambone
  • My dump files are just millions of lines of insert SQL statements. Can I still use SQL Loader to load the data? – d-_-b Oct 06 '16 at 04:00
  • No, not exactly... instead of insert statements, you want to change the files to be either fixed-width or delimited files. What is the origin of the data? My guess would be it's easier to extract it as a data file than a series of SQL statements... If not, you could script the SQL into a data file (using Perl, for example; see the sketch after these comments) – Hambone Oct 06 '16 at 04:12
  • All I have are files that contain millions of insert statements. I guess I should convert them into CSV format or something, as you suggested. – d-_-b Oct 07 '16 at 02:52
  • Where did these SQL inserts come from? It seems to me if you hijack that source, you can change the output to whatever you want – Hambone Oct 07 '16 at 02:54
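
A minimal sketch of that conversion, assuming each line of the dump is a single-row INSERT ... VALUES (...) statement with no column list and no commas or quotes embedded in the data itself (adjust the patterns to match the real dumps):

# strip the INSERT wrapper, drop the quotes, and turn commas into pipes
sed -e "s/^INSERT INTO [A-Za-z0-9_.]* *VALUES *(//" \
    -e "s/);$//" \
    -e "s/'//g" \
    -e "s/, */|/g" \
    table_dump1.sql > mytable.dat

The resulting pipe-delimited mytable.dat is the kind of file the control file above expects.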