SQL Loader 31 May 2019

The document discusses using SQL Loader to load data from third party databases into Oracle tables. It provides steps for loading data from an MS Access table into an Oracle table as a case study example. The key steps are: 1) Convert the MS Access table to a comma-delimited flat file, 2) Create the target Oracle table with the same structure, 3) Write a SQL Loader control file specifying the flat file location and field mappings, 4) Run SQL Loader to load the data using the control file. The document also compares conventional path loading vs direct path loading in SQL Loader.

Uploaded by Raja Sekhar


The SQL*Loader utility is used to load data from other data sources into Oracle. For example, if you
have a table in FoxPro, Access, Sybase or any other third-party database, you can use SQL*Loader
to load the data into Oracle tables. SQL*Loader reads data only from flat files, so if you want to
load data from FoxPro or any other database, you first have to convert that data into a
delimited-format or fixed-length-format flat file, and then use SQL*Loader to load the data into
Oracle.
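As a minimal sketch of producing such a delimited flat file, the following Python snippet writes rows in the comma-delimited format SQL*Loader expects (the file name emp.csv and the sample rows are illustrative, not from the original case):

```python
import csv

# Hypothetical rows exported from a third-party database
rows = [
    (7369, "SMITH", 800.00, "12/17/1980"),
    (7499, "ALLEN", 1600.00, "02/20/1981"),
]

# Write a comma-delimited flat file that SQL*Loader can read
with open("emp.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows(rows)

with open("emp.csv") as f:
    print(f.read().strip())
```

Each record sits on its own line, with fields separated by the delimiter the control file will later name in FIELDS TERMINATED BY.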

The following is the procedure to load data from a third-party database into Oracle using SQL*Loader.

1. Convert the data into a flat file using a third-party database command.
2. Create the table structure in the Oracle database using appropriate datatypes.
3. Write a control file describing how to interpret the flat file and the options used to load the
data.
4. Execute the SQL*Loader utility, specifying the control file as a command line argument.

To understand this better, let us look at the following case study.

CASE STUDY (Loading Data from MS-ACCESS to Oracle)

Suppose you have a table named EMP in MS-ACCESS, running under the Windows O/S, with the
following structure

EMPNO INTEGER
NAME TEXT(50)
SAL CURRENCY
JDATE DATE

This table contains some 10,000 rows. Now you want to load the data from this table into an
Oracle table. The Oracle database is running on the Linux O/S.

Solution

Step 1

Start MS-Access and convert the table into a comma-delimited flat file (popularly known as CSV)
by clicking on the File/Save As menu. Let the delimited file be named emp.csv

Now transfer this file to the Linux server using the FTP command:

a. Go to the command prompt in Windows.

b. At the command prompt, type FTP followed by the IP address of the server running
Oracle. FTP will then prompt you for a username and password to connect to the Linux
server. Supply a valid username and password of an Oracle user on Linux.

For example:

C:\> ftp 200.200.100.111
Name: oracle
Password: oracle
FTP>

c. Now give the PUT command to transfer the file from the current Windows machine to the Linux
machine:

FTP> put
Local file: C:\>emp.csv
remote-file: /u01/oracle/emp.csv
File transferred in 0.29 Seconds
FTP>

d. After the file is transferred, quit the FTP utility by typing the bye command.

FTP> bye
Good-Bye

Step 2

Now come to the Linux machine and create a table in Oracle with the same structure as in
MS-ACCESS, using appropriate datatypes. For example, create a table like this

$ sqlplus scott/tiger
SQL> CREATE TABLE emp (empno number(5),
name varchar2(50),
sal  number(10,2),
jdate date);

Step 3

After creating the table, you have to write a control file describing the actions SQL*Loader
should perform. You can use any text editor to write the control file. Now let us write a
control file for our case study

$ vi emp.ctl

1 LOAD DATA
2 INFILE '/u01/oracle/emp.csv'
3 BADFILE '/u01/oracle/emp.bad'
4 DISCARDFILE '/u01/oracle/emp.dsc'
5 INSERT INTO TABLE emp
6 FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
7 (empno, name, sal, jdate date 'mm/dd/yyyy')

Notes: (do not type the line numbers; they are shown for explanation purposes only)

1. The LOAD DATA statement is required at the beginning of the control file.
2. The INFILE option specifies where the input file is located.
3. Specifying BADFILE is optional. If you specify it, bad records found during loading
will be stored in this file.
4. Specifying DISCARDFILE is optional. If you specify it, records which do not meet a
WHEN condition will be written to this file.
5. You can use any of the following loading options:
    i.   INSERT: Loads rows only if the target table is empty.
    ii.  APPEND: Loads rows whether the target table is empty or not.
    iii. REPLACE: First deletes all rows in the existing table and then loads rows.
    iv.  TRUNCATE: First truncates the table and then loads rows.
6. This line indicates how the fields are separated in the input file. Since in our case the
fields are separated by ",", we have specified "," as the terminating character for fields. You
can replace this with any character that is used to terminate fields. Some popularly used
terminating characters are semicolon ";", colon ":", and pipe "|". TRAILING NULLCOLS
means that if the last column is null it is treated as a null value; otherwise, SQL*Loader will
treat the record as bad if the last column is null.
7. In this line you specify the columns of the target table. Note how the format for the
date column is specified.

Step 4

After you have written the control file, save it, and then call the SQL*Loader utility by typing the
following command

$sqlldr userid=scott/tiger control=emp.ctl log=emp.log

After you execute the above command, SQL*Loader will show you output describing
how many rows it has loaded.

The LOG option of sqlldr specifies where the log file of this SQL*Loader session should be created.
The log file contains all the actions SQL*Loader performed, i.e., how many rows were
loaded, how many were rejected, how much time was taken to load the rows, and so on. You
should check this file for any errors encountered while running SQL*Loader.
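Since the row counts live in the log file, a small script can pull them out after a load. This is a sketch: the two patterns follow the log wording shown in the examples later in this document, and the excerpt here is hypothetical.

```python
import re

# Hypothetical excerpt from a SQL*Loader log file
log = """Table EMPLOYEE:
  5 Rows successfully loaded.
  2 Rows not loaded due to data errors.
"""

# Extract the loaded/rejected counts from the log text
loaded = int(re.search(r"(\d+) Rows successfully loaded", log).group(1))
rejected = int(re.search(r"(\d+) Rows not loaded due to data errors", log).group(1))
print(loaded, rejected)
```

A wrapper script can use these counts to decide whether a load should be treated as failed.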


Conventional Path Load and Direct Path Load.

SQL*Loader can load data into an Oracle database using the conventional path method or the direct
path method. You specify the method using the DIRECT command line option. If you give
DIRECT=TRUE, SQL*Loader will use direct path loading; otherwise, if you omit this option or
specify DIRECT=FALSE, SQL*Loader will use the conventional path loading method.

Conventional Path

Conventional path load (the default) uses the SQL INSERT statement and a bind array buffer to
load data into database tables.

When SQL*Loader performs a conventional path load, it competes equally with all other
processes for buffer resources. This can slow the load significantly. Extra overhead is added as
SQL statements are generated, passed to Oracle, and executed.

The Oracle database looks for partially filled blocks and attempts to fill them on each insert.
Although appropriate during normal use, this can slow bulk loads dramatically.

Direct Path

In direct path loading, Oracle does not use the SQL INSERT statement for loading rows. Instead, it
writes the rows directly into fresh blocks beyond the high-water mark in the datafiles, i.e., it does
not scan for free blocks below the high-water mark. Direct path load is very fast because:

 Partial blocks are not used, so no reads are needed to find them, and fewer writes are
performed.
 SQL*Loader need not execute any SQL INSERT statements; therefore, the processing
load on the Oracle database is reduced.
 A direct path load calls on Oracle to lock tables and indexes at the start of the load and
releases them when the load is finished. A conventional path load calls Oracle once for
each array of rows to process a SQL INSERT statement.
 A direct path load uses multiblock asynchronous I/O for writes to the database files.
 During a direct path load, processes perform their own write I/O, instead of using
Oracle's buffer cache. This minimizes contention with other Oracle users.

Restrictions on Using Direct Path Loads

The following conditions must be satisfied to use the direct path load method:

 Tables are not clustered.

 Tables to be loaded do not have any active transactions pending.

In addition, the following are not supported with direct path loads:

 Loading a parent table together with a child table.

 Loading BFILE columns.

10 Oracle SQLLDR Command Examples


(Oracle SQL*Loader Tutorial)
by RAMESH NATARAJAN on JUNE 25, 2012

If you are using the Oracle database, at some point you might have to deal with uploading data to the tables from a text file.

This article provides 10 practical examples on how to upload data from a flat file to
Oracle tables.

Input data file for SQL*Loader


This is the input text file that contains the data that needs to be loaded into an Oracle
table. Each record needs to be on a separate line, and the column values should be
delimited by a common delimiter character. For some of the examples mentioned below,
we'll use the following employee.txt file to upload the data to the employee table.


$ cat employee.txt

100,Thomas,Sales,5000

200,Jason,Technology,5500

300,Mayla,Technology,7000

400,Nisha,Marketing,9500

500,Randy,Technology,6000

501,Ritu,Accounting,5400

SQL*Loader Control File


This contains the instructions to the sqlldr utility. It tells sqlldr the location of the
input file, the format of the input file, and other optional metadata
required by sqlldr to upload the data into Oracle tables.

$ cat example1.ctl

load data

infile '/home/ramesh/employee.txt'

into table employee

fields terminated by ","

( id, name, dept, salary )


The above control file indicates the following:

 infile – Indicates the location of the input data file


 into table – Indicates the table name where this data should be inserted
 fields terminated by – Indicates the delimiter that is used in the input file to separate the
fields
 ( id, name, dept, salary ) – Lists the names of the columns in the table into which
the data should be uploaded

1. Basic Upload Example Using SQL*Loader


First, create the employee table as shown below.

SQL> create table employee
(
  id integer,

  name varchar2(10),

  dept varchar2(15),

  salary integer,

  hiredon date
);

Next create the control file that explains what needs to be upload and where.


$ cat sqlldr-add-new.ctl

load data

infile '/home/ramesh/employee.txt'

into table employee

fields terminated by ","

( id, name, dept, salary )

Note: If you have the values inside the data file enclosed with double quotes, use this in
your control file: fields terminated by "," optionally enclosed by '"'

Note: If you don’t have the table created, you’ll get the following error message:

SQL*Loader-941: Error during describe of table EMPLOYEE

ORA-04043: object EMPLOYEE does not exist

You can pass the userid and password to the sqlldr command using any one of the
following formats. As you see below, both of these will prompt you for the control file
location, as it was not given in the command line.

$ sqlldr scott/tiger


(or)

$ sqlldr userid=scott/tiger

control =

SQL*Loader-287: No control file name specified.

Execute the sqlldr command to upload these new records to the empty table by specifying
both uid/pwd and the control file location as shown below.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-add-new.ctl

Commit point reached - logical record count 5

Verify that the records were created in the database

SQL> select * from employee;

ID NAME DEPT SALARY HIREDON

---------- ---------- --------------- ---------- -------

100 Thomas Sales 5000


200 Jason Technology 5500

300 Mayla Technology 7000

400 Nisha Marketing 9500

500 Randy Technology 6000

This will create an output log file with the same name as the control file, but with the .log
extension (instead of .ctl). Partial output shown below.

$ cat sqlldr-add-new.log

Control File: /home/ramesh/sqlldr-add-new.ctl

Data File: /home/ramesh/employee.txt

Table EMPLOYEE:

5 Rows successfully loaded.

0 Rows not loaded due to data errors.

0 Rows not loaded because all WHEN clauses were failed.

0 Rows not loaded because all fields were null.


Elapsed time was: 00:00:00.04

CPU time was: 00:00:00.00


2. Inserting Additional Records


Let us say you want to add two new employees to the employee table from the following
newemployee.txt file.

$ vi newemployee.txt

600,Ritu,Accounting,5400

700,Jessica,Marketing,7800

If you create a similar control file like the previous example, you might get the following
error message.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-add-more.ctl

SQL*Loader-601: For INSERT option, table must be empty. Error on table EMPLOYEE

The above indicates that the table must be empty before you can upload data using
the default INSERT option of SQL*Loader.


If you like to insert more data into the tables without having to delete the existing rows,
use the "append" command as shown in the following control file.

$ vi sqlldr-append-more.ctl

load data

infile '/home/ramesh/newemployee.txt'

append

into table employee

fields terminated by ","

( id, name, dept, salary )

Now, if you run sqlldr, it will append the data.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-append-more.ctl

Commit point reached - logical record count 2

Verify that the records are appended successfully

SQL> select * from employee;


ID NAME DEPT SALARY HIREDON

---------- ---------- --------------- ---------- -------

100 Thomas Sales 5000

200 Jason Technology 5500

300 Mayla Technology 7000

400 Nisha Marketing 9500

500 Randy Technology 6000

600 Ritu Accounting 5400

700 Jessica Marketing 7800

3. Data inside the Control File using BEGINDATA


You can also specify the data directly inside the control file itself using the BEGINDATA
keyword, i.e., anything that comes after BEGINDATA will be treated as data to be
uploaded to the table, as shown below.

$ cat sqlldr-add-new-with-data.ctl

load data

infile *

into table employee


fields terminated by ","

( id, name, dept, salary )

begindata

100,Thomas,Sales,5000

200,Jason,Technology,5500

300,Mayla,Technology,7000

400,Nisha,Marketing,9500

500,Randy,Technology,6000

Note: The infile will say '*' in this case, as there is no input data file name for this
example.

Execute sqlldr to upload the data from the control file.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-add-new-with-data.ctl

4. Date format and Different Delimiter


This example shows how to specify a date format in the control file and how to handle
different delimiters in a data file.

The following example has different delimiters ($ after name, ^ after department).


$ cat employee-date.txt

100,Thomas$Sales^5000,31-JAN-2008

200,Jason$Technology^5500,01-Feb-2005

300,Mayla$Technology^7000,10-Aug-2000

400,Nisha$Marketing^9500,12-Dec-2011

500,Randy$Technology^6000,01-JAN-2007

Create the following control file and indicate the field delimiters for each and every field
using "terminated by" as shown below.

$ cat sqlldr-date.ctl

load data

infile '/home/ramesh/employee-date.txt'

into table employee

fields terminated by ","

( id, name terminated by "$", dept terminated by "^", salary, hiredon DATE "dd-mon-yyyy" )

Load the data using sqlldr as shown below.


$ sqlldr scott/tiger control=/home/ramesh/sqlldr-date.ctl

Verify that the data got loaded properly as shown below.

SQL> select * from employee;

ID NAME DEPT SALARY HIREDON

---------- ---------- --------------- ---------- ---------

100 Thomas Sales 5000 31-JAN-08

200 Jason Technology 5500 01-FEB-05

300 Mayla Technology 7000 10-AUG-00

400 Nisha Marketing 9500 12-DEC-11

500 Randy Technology 6000 01-JAN-07

5. Fixed Length Data Upload


If you have a data file with fixed-length records (i.e., without any delimiter), you
can use this example to upload the data.

For this example, let us use the following file which has fixed-length data. For
example, the first three characters are always the employee number, the next five characters are
always the employee name, and so on.


$ cat employee-fixed.txt

200JasonTechnology5500

300MaylaTechnology7000

400NishaTechnology9500

500RandyTechnology6000
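The POSITION(start:end) idea used below can be sketched in Python. Note that SQL*Loader positions are 1-based and inclusive on both ends, unlike Python slices (the helper name `field` is just for illustration):

```python
def field(record, start, end):
    """Extract a field the way POSITION(start:end) does:
    positions are 1-based and inclusive on both ends."""
    return record[start - 1:end]

record = "200JasonTechnology5500"
emp_id = field(record, 1, 3)    # id     position(1:3)
name   = field(record, 4, 8)    # name   position(4:8)
dept   = field(record, 9, 18)   # dept   position(9:18)
salary = field(record, 19, 22)  # salary position(19:22)
print(emp_id, name, dept, salary)
```

This is why the positions in the control file below line up exactly with the column widths in the data file.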

Create the following control file, where you specify the position of each and every field
as shown below using the "Position(start:end)" syntax.

$ cat sqlldr-fixed.ctl

load data

infile '/home/ramesh/employee-fixed.txt'

into table employee

fields terminated by ","

( id position(1:3), name position(4:8), dept position(9:18), salary position(19:22) )

Load this fixed length data using the sqlldr as shown below.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-fixed.ctl


Verify that the data got loaded.

SQL> select * from employee;

ID NAME DEPT SALARY HIREDON

---------- ---------- --------------- ---------- ---------

200 Jason Technology 5500

300 Mayla Technology 7000

400 Nisha Technology 9500

500 Randy Technology 6000

6. Change the data during upload


You can also massage the data and change it during upload based on certain rules.

In the following control file:

 id is incremented by 999 before uploading, i.e., if the emp id is 100 in the data file, it will
be loaded as 1099
 The name is converted to upper case and loaded. This uses the upper function.
 If the department contains the value "Technology", it is changed to "Techies". This uses the
decode function.

$ cat sqlldr-change-data.ctl


load data

infile '/home/ramesh/employee.txt'

into table employee

fields terminated by ","

( id ":id+999",

name "upper(:name)",

dept "decode(:dept,'Technology','Techies', :dept)",

salary

Load the data using this control file which will massage the data before uploading it.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-change-data.ctl

Verify that the data got changed while loading as per our rules.

SQL> select * from employee;

ID NAME DEPT SALARY HIREDON


---------- ---------- --------------- ---------- ---------

1099 THOMAS Sales 5000

1199 JASON Techies 5500

1299 MAYLA Techies 7000

1399 NISHA Marketing 9500

1499 RANDY Techies 6000
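The three transformation rules above can be sketched outside the database. This is a plain Python rendering of the control-file expressions :id+999, upper(:name) and decode(:dept,...), not the sqlldr engine itself:

```python
def transform(row):
    """Mimic the control-file expressions:
    id ":id+999", name "upper(:name)",
    dept "decode(:dept,'Technology','Techies',:dept)"."""
    emp_id, name, dept, salary = row
    return (int(emp_id) + 999,                              # id + 999
            name.upper(),                                   # upper(:name)
            "Techies" if dept == "Technology" else dept,    # decode(...)
            int(salary))

print(transform(("100", "Thomas", "Sales", "5000")))
print(transform(("200", "Jason", "Technology", "5500")))
```

The outputs match the rows shown in the verification query above: (1099, 'THOMAS', 'Sales', 5000) and (1199, 'JASON', 'Techies', 5500).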

7. Load data from multiple files


To load data from multiple files, you just have to specify multiple infile entries in the control
file.

The following control file loads data from two different data files (employee.txt and
newemployee.txt) to the employee table.

$ cat sqlldr-add-multiple.ctl

load data

infile '/home/ramesh/employee.txt'

infile '/home/ramesh/newemployee.txt'

into table employee

fields terminated by ","


( id, name, dept, salary )

Load the data using this control file which will upload data from multiple data files as
shown below.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-add-multiple.ctl

Commit point reached - logical record count 5

Commit point reached - logical record count 7

8. Load data to Multiple Tables


Create another table called bonus which will have employee id and bonus columns.

create table bonus

( id integer,

bonus integer

);

Create the employee-bonus.txt data file that contains the fields: id, name, department,
salary, bonus

$ cat employee-bonus.txt

100 Thomas Sales 5000 1000


200 Jason Technology 5500 2000

300 Mayla Technology 7000 2000

400 Nisha Marketing 9500 1000

500 Randy Technology 6000 3000

Create the control file as shown below, which will upload the data from the above file to
two different tables. As shown below, you should have two "into table" commands, and
specify the positions of the data which need to be used to upload the data to the
columns.

$ cat sqlldr-multiple-tables.ctl

load data

infile '/home/ramesh/employee-bonus.txt'

into table employee

( id position(1:3),

name position(5:10),

dept position(12:21),

salary position(23:26))

into table bonus


( id position(1:3),

bonus position(28:31))

Load the data to multiple tables using this control file as shown below.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-multiple-tables.ctl

Verify that the data got loaded to multiple tables successfully.

SQL> select * from employee;

ID NAME DEPT SALARY HIREDON

---------- ---------- --------------- ---------- ---------

100 Thomas Sales 5000

200 Jason Technology 5500

300 Mayla Technology 7000

400 Nisha Marketing 9500

500 Randy Technology 6000


SQL> select * from bonus;

ID BONUS

---------- ----------

100 1000

200 2000

300 2000

400 1000

500 3000

9. Handling Bad (Rejected) Records


In the following example, we have two bad records. Employee ids 300 and 500 have a salary
column which is not numeric.

$ cat employee-bad.txt

100,Thomas,Sales,5000

200,Jason,Technology,5500

300,Mayla,Technology,7K


400,Nisha,Marketing,9500

500,Randy,Technology,6K

Use the following control file for this example.

$ cat sqlldr-bad.ctl

load data

infile '/home/ramesh/employee-bad.txt'

into table employee

fields terminated by ","

( id, name, dept, salary )

Load the data (including the invalid records) using this control file as shown below.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-bad.ctl

Commit point reached - logical record count 5

As you see from the above output, it still says "logical record count 5", but you should
check the log files to see if it has rejected any records.

The log file indicates that 2 records are rejected as shown below:


Control File: /home/ramesh/sqlldr-bad.ctl

Data File: /home/ramesh/employee-bad.txt

Bad File: /home/ramesh/employee-bad.bad

Discard File: none specified

Table EMPLOYEE:

3 Rows successfully loaded.

2 Rows not loaded due to data errors.

By default the rejected records are stored in a file that has the same name as the data file
(but with a .bad extension).

$ cat employee-bad.bad

300,Mayla,Technology,7K

500,Randy,Technology,6K

As you see below, the employee table has only 3 records (as 2 of them were rejected).

SQL> select * from employee;


ID NAME DEPT SALARY HIREDON

---------- ---------- --------------- ---------- ---------

100 Thomas Sales 5000

200 Jason Technology 5500

400 Nisha Marketing 9500
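The rejection behaviour above can be sketched in Python: rows whose salary field is not numeric go to a "bad" list, the rest load. This is a simplified model of SQL*Loader's data-error handling, not its actual implementation:

```python
def split_good_bad(lines):
    """Mimic SQL*Loader's data-error rejection for a numeric
    salary column: records with non-numeric salaries are
    rejected (they would land in the .bad file)."""
    good, bad = [], []
    for line in lines:
        emp_id, name, dept, salary = line.split(",")
        if salary.isdigit():
            good.append((int(emp_id), name, dept, int(salary)))
        else:
            bad.append(line)           # rejected record, kept verbatim
    return good, bad

data = [
    "100,Thomas,Sales,5000",
    "300,Mayla,Technology,7K",
    "500,Randy,Technology,6K",
]
good, bad = split_good_bad(data)
print(len(good), "loaded,", len(bad), "rejected")
```

As with sqlldr, the rejected records are kept verbatim so they can be fixed and reloaded.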

10. Load Specific Rows from a datafile


If you want to load only specific records from a data file, use the WHEN clause in the control
file.

Add the "when" clause next to the "into table" line. In the following control file, the when
clause indicates that it will load only the records that have dept as "Technology".

$ cat sqlldr-when.ctl

load data

infile '/home/ramesh/employee.txt'

into table employee

when dept = 'Technology'

fields terminated by ","


( id, name, dept, salary )

Load the selective data (only the “Technology” records) using this control file as shown
below.

$ sqlldr scott/tiger control=/home/ramesh/sqlldr-when.ctl

Commit point reached - logical record count 5

As you see from the above output, it still says "logical record count 5", but you should
check the log files to see how many records were loaded, and how many records were
discarded because they didn't match the when condition.

The following lines from the log file show that 5 records were read, and 2 of them were
discarded as they didn't match the when condition.

Discard File: none specified

Total logical records read: 5

Total logical records discarded: 2

Verify that only the selective records were loaded into the table.

SQL> select * from employee;


ID NAME DEPT SALARY HIREDON

---------- ---------- --------------- ---------- ---------

200 Jason Technology 5500

300 Mayla Technology 7000

500 Randy Technology 6000

 Prithviraj August 6, 2012, 5:45 am

@Rohit,
This is not possible using SQL*Loader alone, because the data type of VendorId is numeric and
from the infile you are getting characters. You have two approaches here:
1) Either you process your infile first and replace X by 1, Y by 2, Z by 3, then use SQL*Loader
with a WHEN condition to check which vendor id it is and what value the amount column should
take; OR
2) You need to use external tables and SQL functions in this scenario.
Refer:
http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1710164700346004127

 Rohit K August 6, 2012, 9:36 am

Thank you, Prithviraj.

We can do it using a control file this way.


I tried using BOUNDFILLER and DECODE, and my CONTROL file looks like this:
INFILE='sale_exec.dat'
APPEND
PRESERVE BLANKS
INTO TABLE SALES
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
(
c1 BOUNDFILLER,
REPORT_DATE,
c2 BOUNDFILLER,
c3 BOUNDFILLER,
VENDORID "to_number(DECODE(:c1,'X',1,'Y',2,3))",
AMOUNT "to_number(DECODE(:c1,'Z',:c3,:c2))"
)
This worked well for me.

Thank You once again.

Regards,
Rohit K

 Uday November 28, 2013, 5:29 am

I have a different scenario. I have a table with 5 columns, c1, c2, c3, c4, c5, and a csv file
that has 6 columns: a, c1, c2, c3, c4, c5. I would like to load the c1 to c5 columns from the csv
file into the c1 to c5 columns in the table. Can we skip any columns from the csv, or can
we map csv columns to table columns in the loader control file?


 Uday November 28, 2013, 5:55 am


I got it. Putting the FILLER keyword to the right of the column name skips that column:

For e.g.:

OPTIONS (SKIP=1)
LOAD DATA
INFILE 'source.csv'
BADFILE 'source.csv.bad'
DISCARDFILE 'source.csv.dsc'
APPEND
INTO TABLE tab_name
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( a FILLER
,c1
,c2
,c3
,c4
,c5
)

General


Note: This page consists of a series of demonstrations of various SQL*Loader capabilities. It is by no means complete.

For the Oracle documentation, see:
http://download-west.oracle.com/docs/cd/B19306_01/server.102/b14215/app_ldr_syntax.htm#i631434
SQL*Loader Data Types
    CHAR
    DECIMAL EXTERNAL
    INTEGER EXTERNAL

Modes
    APPEND
    INSERT
    REPLACE
    TRUNCATE

INFILE
    INFILE * or INFILE '<file_name>'
    INFILE [RECSIZE <integer> BUFFERS <integer>]
    INFILE 'mydata.dat' "RECSIZE 80 BUFFERS 8"

INTO
    INTO TABLE <table_name>
    INTO TABLE emp

BADFILE (records with formatting errors or that cause Oracle errors)
    BADFILE '<file_name>'
    BADFILE 'sample.bad'

DISCARDFILE (records not satisfying a WHEN clause)
    DISCARDFILE '<file_name>'
    DISCARDMAX <integer>
    DISCARDFILE 'sample.dsc'

CHARACTERSET
    CHARACTERSET <character_set_name>
    CHARACTERSET WE8MSWIN1252

LENGTH SEMANTICS
    LENGTH SEMANTICS <BYTE | CHAR>
    LENGTH SEMANTICS BYTE
    -- this is the default for all character sets except UTF16

LOAD TYPES
    APPEND
    INSERT
    REPLACE
    TRUNCATE

OPTIONS CLAUSE
    BINDSIZE = n
    COLUMNARRAYROWS = n
    DIRECT = {TRUE | FALSE}
    ERRORS = n
    LOAD = n
    MULTITHREADING = {TRUE | FALSE}
    PARALLEL = {TRUE | FALSE}
    READSIZE = n
    RESUMABLE = {TRUE | FALSE}
    RESUMABLE_NAME = 'text string'
    RESUMABLE_TIMEOUT = n
    ROWS = n
    SILENT = {HEADER | FEEDBACK | ERRORS | DISCARDS | PARTITIONS | ALL}
    SKIP = n
    SKIP_INDEX_MAINTENANCE = {TRUE | FALSE}
    SKIP_UNUSABLE_INDEXES = {TRUE | FALSE}
    STREAMSIZE = n
    OPTIONS (BINDSIZE=100000, SILENT=(ERRORS, FEEDBACK))

PATHS
    CONVENTIONAL PATH
    DIRECT PATH
    All loads demonstrated below are conventional, with the exception of demo 6.

TERMINATORS
    Comma ','
    Tab 0x'09'

TRAILING NULLCOLS
    -- assuming this data
    10 Accounting

    -- the following
    INTO TABLE dept
    TRAILING NULLCOLS
    ( deptno CHAR TERMINATED BY " ",
      dname  CHAR TERMINATED BY WHITESPACE,
      loc    CHAR TERMINATED BY WHITESPACE)

    -- would generate an error without TRAILING NULLCOLS
    -- as it doesn't have loc data

WHEN
    WHEN <condition>
    See Demo 5 below

Assembling Logical Records

CONCATENATE
    CONCATENATE <number_of_physical_records>
    CONCATENATE 3

CONTINUEIF
    CONTINUEIF THIS [PRESERVE] (start_position:end_position) = value
    CONTINUEIF THIS (1:2) = '%%'
    CONTINUEIF THIS PRESERVE (1:2) = '%%'

    CONTINUEIF NEXT [PRESERVE] (start_position:end_position) = value
    CONTINUEIF NEXT (1:2) = '%%'
    CONTINUEIF NEXT PRESERVE (1:2) = '%%'

    CONTINUEIF LAST (start_position:end_position) = value
    -- Tests against the last non-blank character.
    -- Allows only a single character for the test.

PRESERVE
    Preserves the CONTINUEIF characters in the assembled logical record.
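The CONTINUEIF THIS behaviour can be sketched in Python. This is a simplified model, assuming the continuation field always occupies the first two character positions of every physical record (per the Oracle docs, without PRESERVE the continuation positions are removed from every record during assembly):

```python
def assemble(records, marker="%%", preserve=False):
    """Simplified sketch of CONTINUEIF THIS (1:2) = '%%':
    a physical record whose first two characters equal the
    marker is continued by the next record; unless PRESERVE
    is used, the continuation positions are stripped from
    every record before assembly."""
    logical, current = [], ""
    for rec in records:
        cont = rec[:len(marker)] == marker      # continuation flagged?
        body = rec if preserve else rec[len(marker):]
        current += body
        if not cont:                            # logical record complete
            logical.append(current)
            current = ""
    return logical

physical = ["%%10 Acc", "%%ount", "  ing", "  20 Sales"]
print(assemble(physical))
```

Here the first three physical records assemble into one logical record, "10 Accounting", and the fourth stands alone.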
 
Demo Tables & Data
CREATE TABLE dept (
deptno   VARCHAR2(2),
dname    VARCHAR2(20),
loc      VARCHAR2(20));

CREATE TABLE emp (
empno    NUMBER(4),
ename    VARCHAR2(10),
job      VARCHAR2(10),
mgr      NUMBER(4),
hiredate DATE,
sal      NUMBER(8,2),
comm     NUMBER(7,2),
deptno   NUMBER(2),
projno   NUMBER(4),
loadseq  NUMBER(3)); 

CREATE TABLE proj (
emp      NUMBER(4),
projno   NUMBER(3)); 

CREATE TABLE funcdemo (
last_name  VARCHAR2(20),
first_name VARCHAR2(20));

CREATE TABLE decodemo (
fld1    VARCHAR2(20),
fld2    VARCHAR2(20));


CREATE TABLE denver_prj (
projno  VARCHAR2(3),
empno   NUMBER(5),
projhrs NUMBER(2));

CREATE TABLE orlando_prj (
projno  VARCHAR2(3),
empno   NUMBER(5),
projhrs NUMBER(2));

CREATE TABLE misc_prj (
projno  VARCHAR2(3),
empno   NUMBER(5),
projhrs NUMBER(2));

CREATE TABLE po_tab OF XMLTYPE;

CREATE TABLE loadnums(
col1 VARCHAR2(10),
col2 NUMBER);
 
Demo 1
Basic import of delimited data with data in the control file
Control File:

OPTIONS (ERRORS=500, SILENT=(FEEDBACK))
LOAD DATA
INFILE *
INTO TABLE <table_name>
FIELDS TERMINATED BY <delimiter>
OPTIONALLY ENCLOSED BY <enclosing character>
(<column_name>, <column_name>, <column_name>)

sqlldr userid=uwclass/uwclass control=c:\load\demo01.ctl log=d:\load\demo01.log
 
Demo 2
Basic import of fixed length data with separate data and control files
Control File (with a separate Data File):

LOAD DATA
INFILE <data_file_path_and_name>
INTO TABLE <table_name> (
<column_name> POSITION(<integer>:<integer>) <data_type>,
<column_name> POSITION(<integer>:<integer>) <data_type>,
<column_name> POSITION(<integer>:<integer>) <data_type>)

sqlldr userid=uwclass/uwclass control=c:\load\demo02.ctl log=c:\load\demo02.log
 
Demo 3
Append of delimited data with data in the control file. This sample demonstrates date formatting,
delimiters within delimiters, and implementation of record numbering with a SQL*Loader sequence.
APPEND indicates that the table need not be empty before SQL*Loader is run.
Control File:

LOAD DATA
INFILE *
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
(<column_name>, <column_name> DATE "DD-Month-YYYY",
<column_name> CHAR TERMINATED BY ':',
<column_name> SEQUENCE(MAX,1))

sqlldr userid=uwclass/uwclass control=c:\load\demo03.ctl log=c:\load\demo3.log
 
Demo 4
Replace of fixed length data with separate data and control files. This sample demonstrates specifying a
discard file, the maximum number of records to discard (DISCARDMAX), and CONTINUEIF, where it
looks for an asterisk in the first position to determine if a new line has started.
Control File (with a separate Data File):

LOAD DATA
INFILE 'c:\temp\demo04.dat'
DISCARDFILE 'c:\temp\demo4.dsc'
DISCARDMAX 999
REPLACE
CONTINUEIF THIS (1) = '*'
INTO TABLE emp (
empno    POSITION(1:4)   INTEGER EXTERNAL,
ename    POSITION(6:15)  CHAR,
hiredate POSITION(52:60) INTEGER EXTERNAL)

sqlldr userid=uwclass/uwclass control=c:\load\demo04.ctl log=c:\load\demo4.log
 
Demo 5
Loading into multiple tables during an import using the WHEN keyword. The control file loads two different
tables, making three passes at one of them. Note the problem with the Doolittle record and how it is handled.
Control File:
LOAD DATA
INFILE 'c:\temp\demo05.dat'
BADFILE 'c:\temp\bad05.bad'
DISCARDFILE 'c:\temp\disc05.dsc'
REPLACE
INTO TABLE emp (
empno  POSITION(1:4)   INTEGER EXTERNAL,
ename  POSITION(6:15)  CHAR,
deptno POSITION(17:18) CHAR,
mgr    POSITION(20:23) INTEGER EXTERNAL)

-- 1st project: proj has two columns, both NOT NULL


INTO TABLE proj
WHEN projno != ' ' (
emp    POSITION(1:4)   INTEGER EXTERNAL,
projno POSITION(25:27) INTEGER EXTERNAL)

-- 2nd project
INTO TABLE proj
WHEN projno != ' ' (
emp    POSITION(1:4)   INTEGER EXTERNAL,
projno POSITION(29:31) INTEGER EXTERNAL)

-- 3rd project
INTO TABLE proj
WHEN projno != ' ' (
emp    POSITION(1:4)   INTEGER EXTERNAL,
projno POSITION(33:35) INTEGER EXTERNAL)
sqlldr userid=uwclass/uwclass control=c:\load\demo5.ctl
log=c:\load\demo5.log
 
Demo 6
Using the NULLIF and BLANKS keywords to handle zero-length strings being loaded into numeric
columns. Also note the use of a direct path load on the command line (DIRECT=TRUE).
Control File:
LOAD DATA
INFILE 'c:\temp\demo06.dat'
INSERT
INTO TABLE emp
-- SORTED INDEXES (emp_empno)
(
empno  POSITION(01:04) INTEGER EXTERNAL NULLIF empno=BLANKS,
ename  POSITION(06:15) CHAR,
job    POSITION(17:25) CHAR,
mgr    POSITION(27:30) INTEGER EXTERNAL NULLIF mgr=BLANKS,
sal    POSITION(32:39) DECIMAL EXTERNAL NULLIF sal=BLANKS,
comm   POSITION(41:48) DECIMAL EXTERNAL NULLIF comm=BLANKS,
deptno POSITION(50:51) INTEGER EXTERNAL NULLIF deptno=BLANKS)
sqlldr userid=uwclass/uwclass control=c:\load\demo06.ctl
log=c:\load\demo06.log DIRECT=TRUE
 
Demo 7
Using a built-in function to modify data during loading.
Control File:
LOAD DATA
INFILE *
INSERT
INTO TABLE funcdemo
(
LAST_NAME  position(1:7)  CHAR "UPPER(:LAST_NAME)",
FIRST_NAME position(8:15) CHAR "LOWER(:FIRST_NAME)"
)
BEGINDATA
Locke Phil
Gorman Tim
sqlldr userid=uwclass/uwclass control=c:\load\demo07.ctl
log=c:\load\demo07.log
 
Demo 8
Another example of using a built-in function, in this case DECODE, to modify data during loading
Control File:
LOAD DATA
INFILE *
INSERT
INTO TABLE decodemo
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(
fld1,
fld2 "DECODE(:fld1, 'hello', 'goodbye', :fld1)"
)
BEGINDATA
hello,""
goodbye,""
this is a test,""
hello,""
sqlldr userid=uwclass/uwclass control=c:\load\demo08.ctl
log=c:\load\demo08.log
 
Demo 9
Loading multiple files into multiple tables in a single control file. Note the use of the WHEN keyword.
Control File:
LOAD DATA
INFILE 'c:\temp\demo09a.dat'
INFILE 'c:\temp\demo09b.dat'
APPEND
INTO TABLE denver_prj
WHEN projno = '101' (
projno  position(1:3)  CHAR,
empno   position(4:8)  INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)

INTO TABLE orlando_prj
WHEN projno = '202' (
projno  position(1:3)  CHAR,
empno   position(4:8)  INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)

INTO TABLE misc_prj
WHEN projno != '101' AND projno != '202' (
projno  position(1:3)  CHAR,
empno   position(4:8)  INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)
sqlldr userid=uwclass/uwclass control=c:\load\demo09.ctl
log=c:\load\demo09.log
 
Demo 10
Loading negative numeric values. Note Clark's and Miller's records in the data file, and the empty row.
Control File:
LOAD DATA
INFILE 'c:\temp\demo10.dat'
INTO TABLE emp
REJECT ROWS WITH ALL NULL FIELDS
(
empno  POSITION(01:04) INTEGER EXTERNAL,
ename  POSITION(06:15) CHAR,
job    POSITION(17:25) CHAR,
mgr    POSITION(27:30) INTEGER EXTERNAL,
sal    POSITION(32:39) DECIMAL EXTERNAL,
comm   POSITION(41:48) DECIMAL EXTERNAL,
deptno POSITION(50:51) INTEGER EXTERNAL)
sqlldr userid=uwclass/uwclass control=c:\load\demo10.ctl
log=c:\load\demo10.log
 
Demo 11
Loading XML
Control File:
LOAD DATA
INFILE *
INTO TABLE po_tab
APPEND
XMLTYPE (xmldata)
FIELDS
(xmldata CHAR(2000))

desc po_tab

sqlldr userid=uwclass/uwclass control=c:\load\demo11.ctl
log=c:\load\demo11.log

set long 1000000

SELECT * FROM po_tab;

SELECT *
FROM po_tab
WHERE sys_nc_rowinfo$ LIKE '%Hurry%';
 
Demo 12
Loading a CONSTANT, RECNUM, and SYSDATE
Control File:
OPTIONS (ERRORS=100, SILENT=(FEEDBACK))
LOAD DATA
INFILE *
REPLACE
INTO TABLE dept
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(recno RECNUM, deptno CONSTANT "XX", dname, loc,
tdate SYSDATE)

ALTER TABLE dept
ADD (recno NUMBER(5), tdate DATE);

desc dept

sqlldr userid=uwclass/uwclass control=c:\load\demo12.ctl


log=c:\load\demo12.log
 
Demo 13
Setting READSIZE and BINDSIZE
Note: the control file and data for this demo can be found under $ORACLE_HOME in
/demo/schema/sales_history as cust1v3.ctl and cust1v3.dat. BINDSIZE and READSIZE do not apply
to direct path loads.

Control File:
LOAD DATA
INFILE 'c:\temp\cust1v3.dat'
INTO TABLE CUSTOMERS
TRUNCATE
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
(CUST_ID, CUST_FIRST_NAME, CUST_LAST_NAME,
CUST_GENDER, CUST_YEAR_OF_BIRTH,
CUST_MARITAL_STATUS, CUST_STREET_ADDRESS,
CUST_POSTAL_CODE, CUST_CITY, CUST_CITY_ID,
CUST_STATE_PROVINCE, CUST_STATE_PROVINCE_ID,
COUNTRY_ID, CUST_MAIN_PHONE_NUMBER,
CUST_INCOME_LEVEL, CUST_CREDIT_LIMIT, CUST_EMAIL,
CUST_TOTAL, CUST_TOTAL_ID, CUST_SRC_ID,
CUST_EFF_FROM DATE(19) "YYYY-MM-DD-HH24-MI-SS",
CUST_EFF_TO DATE(19) "YYYY-MM-DD-HH24-MI-SS",
CUST_VALID)
conn sh/sh

GRANT select ON customers TO uwclass;

conn uwclass/uwclass

CREATE TABLE customers AS
SELECT * FROM sh.customers
WHERE 1=2;

desc customers

-- run 1 - default sizing 


sqlldr userid=uwclass/uwclass control=c:\load\demo13.ctl
log=c:\load\demo13.log

Space allocated for bind array: 251252 bytes(46 rows)


Read buffer bytes: 1048576
Elapsed time was: 00:00:03.73
CPU time was: 00:00:01.25

-- run 2 - double default to 2M


sqlldr userid=uwclass/uwclass control=c:\load\demo13.ctl
log=c:\load\demo13.log readsize=2048000 bindsize=2048000 rows=64

Space allocated for bind array: 349568 bytes(64 rows)


Read buffer bytes: 2048000

Elapsed time was: 00:00:03.50

CPU time was: 00:00:01.09

-- run 3 - double again to 4M


sqlldr userid=uwclass/uwclass control=c:\load\demo13.ctl
log=c:\load\demo13.log readsize=4096000 bindsize=4096000 rows=64

Space allocated for bind array: 349568 bytes(64 rows)


Read buffer bytes: 4096000

Elapsed time was: 00:00:03.65


CPU time was: 00:00:01.12
 
Demo 14
Load numbers with trailing + and - signs.
Control File:
LOAD DATA
INFILE *
TRUNCATE
INTO TABLE loadnums (
col1 position(1:5),
col2 position(7:16) "TO_NUMBER(:col2,'99,999.99MI')")
BEGINDATA
abcde 1,234.99-
abcde 11,234.34+
abcde 45.23-
abcde 99,234.38-
abcde 23,234.23+
abcde 98,234.23+
sqlldr userid=uwclass/uwclass control=c:\load\demo14.ctl
log=c:\load\demo14.log
 

What is SQL*Loader and what is it used for?


SQL*Loader is a bulk loader utility used for moving data from external files into the Oracle database. Its syntax
is similar to that of the DB2 load utility, but comes with more options. SQL*Loader supports various load
formats, selective loading, and multi-table loads.
SQL*Loader (sqlldr) is the utility to use for high performance data loads. The data can be loaded from
any text file and inserted into the database.

How does one use the SQL*Loader utility?
One can load data into an Oracle database by using the sqlldr (sqlload on some platforms) utility. Invoke the
utility without arguments to get a list of available parameters. Look at the following example:

sqlldr username@server/password control=loader.ctl


sqlldr username/password@server control=loader.ctl

This sample control file (loader.ctl) will load an external data file containing delimited data:

load data
infile 'c:\data\mydata.csv'
into table emp
fields terminated by "," optionally enclosed by '"'
( empno, empname, sal, deptno )

The mydata.csv file may look like this:

10001,"Scott Tiger", 1000, 40


10002,"Frank Naude", 500, 20

Optionally, you can work with tabulation delimited files by using one of the following syntaxes:

fields terminated by "\t"


fields terminated by X'09'

Additionally, if your file is in Unicode, you can make the following addition:

load data
CHARACTERSET UTF16
infile 'c:\data\mydata.csv'
into table emp
fields terminated by "," optionally enclosed by '"'
( empno, empname, sal, deptno )

Another sample control file, with in-line data formatted as fixed length records. The trick is to specify "*" as the
name of the data file, and to use BEGINDATA to start the data section in the control file:

load data
infile *
replace
into table departments
( dept position (02:05) char(4),
deptname position (08:27) char(20)
)
begindata
COSC COMPUTER SCIENCE


ENGL ENGLISH LITERATURE
MATH MATHEMATICS
POLY POLITICAL SCIENCE

How does one load MS-Excel data into Oracle?


Open the MS-Excel spreadsheet and save it as a CSV (Comma Separated Values) file. This file can now be
copied to the Oracle machine and loaded using the SQL*Loader utility.
Possible problems and workarounds:
The spreadsheet may contain cells with newline characters (ALT+ENTER). SQL*Loader expects the entire
record to be on a single line. Run the following macro to remove newline characters (Tools -> Macro -> Visual
Basic Editor):

' Removing tabs and carriage returns from worksheet cells


Sub CleanUp()
Dim TheCell As Range
On Error Resume Next

For Each TheCell In ActiveSheet.UsedRange


With TheCell
If .HasFormula = False Then
.Value = Application.WorksheetFunction.Clean(.Value)
End If
End With
Next TheCell
End Sub

Tools:
If you need a utility to load Excel data into Oracle, download quickload from sourceforge
at http://sourceforge.net/projects/quickload

Is there a SQL*Unloader to download data to a flat file?


Oracle does not supply any data unload utilities. Here are some workarounds:
Using SQL*Plus
You can use SQL*Plus to select and format your data and then spool it to a file. This example spools out a
CSV (comma separated values) file that can be imported into MS-Excel:

set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
spool oradata.txt
select col1 || ',' || col2 || ',' || col3
from tab1
where col2 = 'XYZ';
spool off
Warning: if your data contains a comma, choose another separator that is not present in the data. You can also
enclose the column that contains the comma in double quotes (").
You can also use the "set colsep" command if you don't want to put the commas in by hand. This saves a lot of
typing. Example:

set colsep ','


set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
spool oradata.txt
select col1, col2, col3
from tab1
where col2 = 'XYZ';
spool off

Using PL/SQL
PL/SQL's UTL_FILE package can also be used to unload data. Example:

declare
fp utl_file.file_type;
begin
fp := utl_file.fopen('c:\oradata','tab1.txt','w');
utl_file.putf(fp, '%s, %s\n', 'TextField', 55);
utl_file.fclose(fp);
end;
/

Using Oracle SQL Developer


The freely downloadable Oracle SQL Developer application is capable of exporting data from Oracle tables in
numerous formats, like Excel, SQL insert statements, SQL loader format, HTML, XML, PDF, TEXT, Fixed text,
etc.
It can also import data from Excel (.xls), CSV (.csv), Text (.tsv) and DSV (.dsv) formats directly into a database.
Third-party programs
You might also want to investigate third party tools to help you unload data from Oracle. Here are some
examples:

 WisdomForce FastReader - http://www.wisdomforce.com


 IxUnload from ixionsoftware.com - http://www.ixionsoftware.com/products/
 FAst extraCT (FACT) for Oracle from CoSort - http://www.cosort.com/products/FACT
 Unicenter (also ManageIT or Platinum) Fast Unload for Oracle from CA
 dbForge Studio for Oracle from Devart
 Keeptool's Hora unload/load facility (part of the v5 to v6 upgrade) can export to formats such as Microsoft
Excel, DBF, XML, and text.
 TOAD from Quest
 SQLWays from Ispirer Systems
 PL/SQL Developer from allroundautomation
Can one load variable and fixed length data records?
Loading delimited (variable length) data
In the first example we will show how delimited (variable length) data can be loaded into Oracle:
LOAD DATA
INFILE *
INTO TABLE load_delimited_data
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( data1,
data2
)
BEGINDATA
11111,AAAAAAAAAA
22222,"A,B,C,D,"

NOTE: The default data type in SQL*Loader is CHAR(255). To load character fields longer than 255
characters, code the type and length in your control file. By doing this, Oracle will allocate a big enough buffer
to hold the entire column, thus eliminating potential "Field in data file exceeds maximum length" errors.
Example:

...
resume char(4000),
...

Loading positional (fixed length) data


If you need to load positional data (fixed length), look at the following control file example:

LOAD DATA
INFILE *
INTO TABLE load_positional_data
( data1 POSITION(1:5),
data2 POSITION(6:15)
)
BEGINDATA
11111AAAAAAAAAA
22222BBBBBBBBBB

For example, position(01:05) will give the 1st to the 5th character (11111 and 22222).

Can one skip header records while loading?


One can skip unwanted header records or continue an interrupted load (for example if you run out of space) by
specifying the "SKIP=n" keyword. "n" specifies the number of logical rows to skip. Look at these examples:

OPTIONS (SKIP=5)
LOAD DATA
INFILE *
INTO TABLE load_positional_data
( data1 POSITION(1:5),
data2 POSITION(6:15)
)
BEGINDATA
11111AAAAAAAAAA
22222BBBBBBBBBB
...
sqlldr userid=ora_id/ora_passwd control=control_file_name.ctl skip=4

If you are continuing a multiple table direct path load, you may need to use the CONTINUE_LOAD clause
instead of the SKIP parameter. CONTINUE_LOAD allows you to specify a different number of rows to skip for
each of the tables you are loading.
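As a sketch, a CONTINUE_LOAD control file might look like the following (the table names, positions, and skip counts here are illustrative, not taken from an actual interrupted load):

```
-- Resume an interrupted multi-table direct path load;
-- each INTO TABLE clause gets its own SKIP count
CONTINUE_LOAD DATA
INFILE 'mydata.dat'
INTO TABLE emp SKIP 1000
( empno POSITION(1:4)  INTEGER EXTERNAL,
  ename POSITION(6:15) CHAR )
INTO TABLE proj SKIP 850
( projno POSITION(25:27) INTEGER EXTERNAL )
```

The per-table skip counts come from the row counts reported in the log file of the interrupted run.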

Can one modify data as the database gets loaded?


Data can be modified as it loads into the Oracle Database. One can also populate columns with static or
derived values. However, this only applies for the conventional load path (and not for direct path loads). Here
are some examples:

LOAD DATA
INFILE *
INTO TABLE modified_data
( rec_no "my_db_sequence.nextval",
region CONSTANT '31',
time_loaded "to_char(SYSDATE, 'HH24:MI')",
data1 POSITION(1:5) ":data1/100",
data2 POSITION(6:15) "upper(:data2)",
data3 POSITION(16:22) "to_date(:data3, 'YYMMDD')"
)
BEGINDATA
11111AAAAAAAAAA991201
22222BBBBBBBBBB990112

LOAD DATA
INFILE 'mail_orders.txt'
BADFILE 'bad_orders.txt'
APPEND
INTO TABLE mailing_list
FIELDS TERMINATED BY ","
( addr,
city,
state,
zipcode,
mailing_addr "decode(:mailing_addr, null, :addr, :mailing_addr)",
mailing_city "decode(:mailing_city, null, :city, :mailing_city)",
mailing_state,
move_date "substr(:move_date, 3, 2) || substr(:move_date, 7, 2)"
)

Can one load data from multiple files into multiple tables at once?

Loading from multiple input files


One can load from multiple input files provided they use the same record format by repeating the INFILE
clause. Here is an example:

LOAD DATA
INFILE file1.dat
INFILE file2.dat
INFILE file3.dat
APPEND
INTO TABLE emp
( empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
deptno POSITION(17:18) CHAR,
mgr POSITION(20:23) INTEGER EXTERNAL
)

Loading into multiple tables


One can also specify multiple "INTO TABLE" clauses in the SQL*Loader control file to load into multiple tables.
Look at the following example:

LOAD DATA
INFILE *
INTO TABLE tab1 WHEN tab = 'tab1'
( tab FILLER CHAR(4),
col1 INTEGER
)
INTO TABLE tab2 WHEN tab = 'tab2'
( tab FILLER POSITION(1:4),
col1 INTEGER
)
BEGINDATA
tab1|1
tab1|2
tab2|2
tab3|3

The "tab" field is marked as a FILLER as we don't want to load it.


Note the use of "POSITION" on the second routing value (tab = 'tab2'). By default field scanning doesn't start
over from the beginning of the record for new INTO TABLE clauses. Instead, scanning continues where it left
off. POSITION is needed to reset the pointer to the beginning of the record again. In delimited formats, use
"POSITION(1)" after the first column to reset the pointer.
Another example:

LOAD DATA
INFILE 'mydata.dat'
REPLACE
INTO TABLE emp


WHEN empno != ' '
( empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
deptno POSITION(17:18) CHAR,
mgr POSITION(20:23) INTEGER EXTERNAL
)
INTO TABLE proj
WHEN projno != ' '
( projno POSITION(25:27) INTEGER EXTERNAL,
empno POSITION(1:4) INTEGER EXTERNAL
)

Can one selectively load only the records that one needs?
Look at this example; (01) is the first character, (30:37) are characters 30 to 37:

LOAD DATA
INFILE 'mydata.dat' BADFILE 'mydata.bad' DISCARDFILE 'mydata.dis'
APPEND
INTO TABLE my_selective_table
WHEN (01) <> 'H' and (01) <> 'T' and (30:37) = '20031217'
(
region CONSTANT '31',
service_key POSITION(01:11) INTEGER EXTERNAL,
call_b_no POSITION(12:29) CHAR
)

NOTE: SQL*Loader does not allow the use of OR in the WHEN clause. You can only use AND as in the
example above! To workaround this problem, code multiple "INTO TABLE ... WHEN" clauses. Here is an
example:

LOAD DATA
INFILE 'mydata.dat' BADFILE 'mydata.bad' DISCARDFILE 'mydata.dis'
APPEND
INTO TABLE my_selective_table
WHEN (01) <> 'H' and (01) <> 'T'
(
region CONSTANT '31',
service_key POSITION(01:11) INTEGER EXTERNAL,
call_b_no POSITION(12:29) CHAR
)
INTO TABLE my_selective_table
WHEN (30:37) = '20031217'
(
region CONSTANT '31',
service_key POSITION(01:11) INTEGER EXTERNAL,


call_b_no POSITION(12:29) CHAR
)

Can one skip certain columns while loading data?


One cannot use POSITION(x:y) with delimited data. Luckily, from Oracle 8i one can specify FILLER columns.
FILLER columns are used to skip columns/fields in the load file, ignoring fields that one does not want. Look at
this example:

LOAD DATA
TRUNCATE INTO TABLE T1
FIELDS TERMINATED BY ','
( field1,
field2 FILLER,
field3
)

BOUNDFILLER (available with Oracle 9i and above) can be used if the skipped column's value will be required
again later. Here is an example:

LOAD DATA
INFILE *
TRUNCATE INTO TABLE sometable
FIELDS TERMINATED BY "," trailing nullcols
(
c1,
field2 BOUNDFILLER,
field3 BOUNDFILLER,
field4 BOUNDFILLER,
field5 BOUNDFILLER,
c2 ":field2 || :field3",
c3 ":field4 + :field5"
)

How does one load multi-line records?


One can create one logical record from multiple physical records using one of the following two clauses:

 CONCATENATE - use when SQL*Loader should combine the same number of physical records
together to form one logical record.
 CONTINUEIF - use if a condition indicates that multiple records should be treated as one, e.g. by
having a '#' character in column 1.
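A minimal sketch of the CONCATENATE case (the file, table, and column names are made up for illustration):

```
-- Always combine every 2 physical records into one logical record
LOAD DATA
INFILE 'multi.dat'
CONCATENATE 2
INTO TABLE multi_line_data
( col1 POSITION(1:10)  CHAR,
  col2 POSITION(11:20) CHAR )
```

To combine records conditionally instead, replace the CONCATENATE line with something like CONTINUEIF THIS (1:1) = '#', which keeps appending the next physical record for as long as column 1 of the current one contains a '#'.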
How does one load records with multi-line fields?
Using Stream Record format, you can define a record delimiter, so that you're allowed to have the default
delimiter ('\n') in the field's content.
After the INFILE clause set the delimiter:
load data
infile "test.dat" "str '|\n'"
into table test_table
fields terminated by ';' TRAILING NULLCOLS
(
desc,
txt
)

test.dat:

one line;hello dear world;|


two lines;Dear world,
hello!;|

Note that this doesn't seem to work with inline data (INFILE * and BEGINDATA).

How can one get SQL*Loader to COMMIT only at the end of the load file?
One cannot, but by setting the ROWS= parameter to a large value, committing can be reduced. Make sure you
have big rollback segments ready when you use a high value for ROWS=.
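For example, to ask SQL*Loader to commit only every 50000 rows (the credentials, file name, and row count here are illustrative):

```
sqlldr userid=scott/tiger control=loader.ctl rows=50000
```

Note that for conventional path loads the bind array (BINDSIZE) must be large enough to actually hold ROWS rows; if it is not, SQL*Loader lowers the effective ROWS value and commits more often.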

Can one improve the performance of SQL*Loader?


 A very simple but easily overlooked hint is not to have any indexes and/or constraints (primary key) on
your load tables during the load process. Their presence will significantly slow down load times even with
ROWS= set to a high value.

 Add the following option in the command line: DIRECT=TRUE. This will effectively bypass most of the
RDBMS processing. However, there are cases when you can't use direct load. For details, refer to the
FAQ about the differences between the conventional and direct path loader below.

 Turn off database logging by specifying the UNRECOVERABLE option. This option can only be used
with direct data loads.

 Run multiple load jobs concurrently.
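The first hint can be applied by dropping an index before the load and recreating it afterwards (the index, table, and column names below are illustrative):

```
DROP INDEX emp_ename_ix;

-- run the sqlldr load here, e.g.:
-- sqlldr userid=scott/tiger control=loader.ctl direct=true

CREATE INDEX emp_ename_ix ON emp (ename);
```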


What is the difference between the conventional and direct path loader?
The conventional path loader essentially loads the data by using standard INSERT statements. The direct path
loader (DIRECT=TRUE) bypasses much of the logic involved with that, and loads directly into the Oracle data
files. More information about the restrictions of direct path loading can be obtained from the Oracle Server
Utilities Guide.
Some of the restrictions with direct path loads are:

 Loaded data will not be replicated


 Cannot always use SQL strings for column processing in the control file (something like this will
probably fail: col1 date "ddmonyyyy" "substr(:period,1,9)"). Details are in Metalink Note:230120.1.
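The two paths differ only in how sqlldr is invoked; the same control file can be used for both (user and file names are illustrative):

```
REM conventional path (default): loads via ordinary INSERT statements
sqlldr userid=scott/tiger control=loader.ctl

REM direct path: writes formatted data blocks directly into the data files
sqlldr userid=scott/tiger control=loader.ctl direct=true
```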
How does one use SQL*Loader to load images, sound clips and documents?
SQL*Loader can load data from a "primary data file", SDF (Secondary Data file - for loading nested tables and
VARRAYs) or LOBFILE. The LOBFILE method provides an easy way to load documents, photos, images and
audio clips into BLOB and CLOB columns. Look at this example:
Given the following table:

CREATE TABLE image_table (


image_id NUMBER(5),
file_name VARCHAR2(30),
image_data BLOB);

Control File:

LOAD DATA
INFILE *
INTO TABLE image_table
REPLACE
FIELDS TERMINATED BY ','
(
image_id INTEGER(5),
file_name CHAR(30),
image_data LOBFILE (file_name) TERMINATED BY EOF
)
BEGINDATA
001,image1.gif
002,image2.jpg
003,image3.jpg

How does one load EBCDIC data?


Specify the character set WE8EBCDIC500 for the EBCDIC data. The following example shows a
SQL*Loader control file that loads a fixed length EBCDIC record into the Oracle database:

LOAD DATA
CHARACTERSET WE8EBCDIC500
INFILE data.ebc "fix 86 buffers 1024"
BADFILE 'data.bad'
DISCARDFILE 'data.dsc'
REPLACE
INTO TABLE temp_data
(
field1 POSITION (1:4) INTEGER EXTERNAL,
field2 POSITION (5:6) INTEGER EXTERNAL,
field3 POSITION (7:12) INTEGER EXTERNAL,
field4 POSITION (13:42) CHAR,
field5 POSITION (43:72) CHAR,


field6 POSITION (73:73) INTEGER EXTERNAL,
field7 POSITION (74:74) INTEGER EXTERNAL,
field8 POSITION (75:75) INTEGER EXTERNAL,
field9 POSITION (76:86) INTEGER EXTERNAL
)
