
Some years ago, Thomas Kyte said this (here):

My favorite use of external tables:

Load this really big file ASAP. In the past, setup the scripts to parallel direct path load. Coordinate the firing of said scripts. Review log files to make sure they all finished. Split up input file or use skip and load to slice the file up. In short, a pain in the butt.

Now:

create table ET parallel;
create table new_table (...) parallel
as
select * from ET;

parallel direct path load made trivial.

Except this isn't valid syntax...

How do I use Oracle's external tables to perform a "parallel direct path load"?

Johan
Rudiger

1 Answer


The parallel capabilities of external tables are somewhat limited. As far as I know, you need to either have several files with the same format that can be processed in parallel (see below) or a single file with a fixed-length format:

CREATE TABLE WORKING_HOURS_EXT
( employee_id       NUMBER(8), 
  project_id        VARCHAR2(20),
  start_time        VARCHAR2(25), 
  end_time          VARCHAR2(25)
) 
ORGANIZATION EXTERNAL 
( 
  TYPE ORACLE_LOADER 
  DEFAULT DIRECTORY loader_data_dir 
  ACCESS PARAMETERS 
  ( 
    records delimited by newline 
    fields terminated by ';' 
    ( employee_id, project_id, start_time, end_time
    ) 
  ) 
  LOCATION ('hours01.txt', 'hours02.txt', 'hours03.txt') 
) 
PARALLEL;

ALTER SESSION ENABLE PARALLEL DML;

MERGE INTO WORKING_HOURS a
USING WORKING_HOURS_EXT b ON (...
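
Putting the pieces together, the "parallel direct path load" the question asks about can be written out in full. This is a sketch only: it assumes the WORKING_HOURS_EXT external table above and a hypothetical target table WORKING_HOURS, and the TO_DATE format masks are placeholders for whatever format the files actually use:

```sql
-- Option 1: create the target table in one parallel direct-path step
-- (this is the "create table new_table ... as select * from ET" Kyte described).
CREATE TABLE WORKING_HOURS
PARALLEL
NOLOGGING
AS
SELECT employee_id,
       project_id,
       TO_DATE(start_time, 'YYYY-MM-DD HH24:MI:SS') AS start_time,  -- format mask is an assumption
       TO_DATE(end_time,   'YYYY-MM-DD HH24:MI:SS') AS end_time
FROM   WORKING_HOURS_EXT;

-- Option 2: load into an existing table with a parallel direct-path insert.
ALTER SESSION ENABLE PARALLEL DML;

INSERT /*+ APPEND PARALLEL(WORKING_HOURS) */ INTO WORKING_HOURS
SELECT employee_id,
       project_id,
       TO_DATE(start_time, 'YYYY-MM-DD HH24:MI:SS'),
       TO_DATE(end_time,   'YYYY-MM-DD HH24:MI:SS')
FROM   WORKING_HOURS_EXT;

COMMIT;  -- a direct-path insert must be committed before the table is queried in the same session
```

The APPEND hint requests a direct-path insert, and the PARALLEL clauses on both the external table and the target let the read and the write run in parallel; NOLOGGING in option 1 additionally skips redo generation for the bulk load.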
Codo
    The example from Codo is really good, thanks. Just to add some points: 1. It is not mandatory to have multiple files to work in parallel; a parallel read is possible even from a single file. 2. Limitations on the parallel processing include the VAR record format, files residing on a tape drive, etc. 3. Once PARALLEL is defined as part of the table, all queries will try to run in parallel. I tried one sample file with 8 million records and 15 columns; using a single file with the parallel option executed faster than using 5 files (split from the 8 million record file). –  Oct 02 '12 at 10:40