SQL Loader 31 May 2019
The SQL*Loader utility is used to load data from other data sources into Oracle. For example, if you have a table in FoxPro, Access, Sybase, or any other third-party database, you can use SQL*Loader to load that data into Oracle tables. SQL*Loader only reads data from flat files, so to load data from FoxPro or any other database you must first convert that data into a delimited-format or fixed-length-format flat file, and then use SQL*Loader to load it into Oracle.
Following is the procedure to load data from a third-party database into Oracle using SQL*Loader:
1. Convert the data into a flat file using a third-party database command.
2. Create the table structure in the Oracle database using appropriate datatypes.
3. Write a control file describing how to interpret the flat file and the options to load the data.
4. Execute the SQL*Loader utility, specifying the control file as a command-line argument.
Suppose you have a table named EMP in MS-Access, running under Windows, with the following structure:
EMPNO INTEGER
NAME TEXT(50)
SAL CURRENCY
JDATE DATE
This table contains some 10,000 rows. Now you want to load the data from this table into an Oracle table. The Oracle database is running on Linux.
Solution
Step 1
Start MS-Access and convert the table into a comma-delimited flat file (popularly known as CSV) by clicking the File/Save As menu. Let the delimited file be named emp.csv. Then transfer the file to the Linux machine using FTP; FTP will prompt you for a username and password to connect to the Linux machine.
For example:
C:\> ftp 200.200.100.111
Name: oracle
Password: oracle
FTP>
Now give the PUT command to transfer the file from the current Windows machine to the Linux machine:
FTP> put
Local file: C:\emp.csv
remote-file: /u01/oracle/emp.csv
File transferred in 0.29 Seconds
FTP>
After the file is transferred, quit the FTP utility by typing the bye command.
FTP> bye
Good-Bye
Step 2
Now come to the Linux Machine and create a table in Oracle with the same structure as in
MS-ACCESS by taking appropriate datatypes. For example, create a table like this
$ sqlplus scott/tiger
SQL> CREATE TABLE emp (empno number(5),
name varchar2(50),
sal number(10,2),
jdate date);
Step 3
After creating the table, you have to write a control file describing the actions SQL*Loader should perform. You can use any text editor to write the control file. Now let us write a control file for our case study:
$ vi emp.ctl
1 LOAD DATA
2 INFILE '/u01/oracle/emp.csv'
3 BADFILE '/u01/oracle/emp.bad'
4 DISCARDFILE '/u01/oracle/emp.dsc'
5 INSERT INTO TABLE emp
6 FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
7 (empno, name, sal, jdate date 'mm/dd/yyyy')
Notes (do not type the line numbers; they are shown for explanation purposes only):
1. The LOAD DATA statement is required at the beginning of the control file.
2. The INFILE option specifies where the input file is located
3. Specifying BADFILE is optional. If you specify, then bad records found during loading
will be stored in this file.
4. Specifying DISCARDFILE is optional. If you specify, then records which do not meet a
WHEN condition will be written to this file.
5. You can use any of the following loading options:
i. INSERT: Loads rows only if the target table is empty.
ii. APPEND: Loads rows whether or not the target table is empty.
iii. REPLACE: First deletes all rows in the existing table, then loads rows.
iv. TRUNCATE: First truncates the table, then loads rows.
6. This line indicates how the fields are separated in the input file. Since in our case the
fields are separated by ",", we have specified "," as the terminating character for fields. You
can replace this with any character that is used to terminate fields; some popularly used
terminating characters are the semicolon ";", colon ":", pipe "|", etc. TRAILING NULLCOLS
means that if trailing columns are missing from a record, they are treated as null; without it,
SQL*Loader would treat such records as bad.
7. This line specifies the columns of the target table. Note how the format for DATE
columns is specified.
Step 4
After you have written the control file, save it and then call the SQL*Loader utility, passing
the control file as a command-line argument.
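The command itself is missing above; a typical invocation for this case study (file names assumed from the control file shown in Step 3) would be:

```shell
sqlldr userid=scott/tiger control=/u01/oracle/emp.ctl log=/u01/oracle/emp.log
```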
After you execute the command, SQL*Loader shows output describing how many rows it has loaded.
The LOG option of sqlldr specifies where the log file of this SQL*Loader session should be created.
The log file records all the actions SQL*Loader performed, i.e. how many rows were loaded, how
many were rejected, and how much time was taken to load them. Check this file for any errors
encountered while running SQL*Loader.
SQL*Loader can load data into the Oracle database using the conventional path method or the
direct path method. You can specify the method with the DIRECT command-line option. If you
specify DIRECT=TRUE, SQL*Loader uses direct path loading; if you omit this option or specify
DIRECT=FALSE, SQL*Loader uses the conventional path loading method.
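For example, assuming a control file named emp.ctl, the two methods would be invoked like this:

```shell
sqlldr userid=scott/tiger control=emp.ctl log=emp.log              # conventional path (default)
sqlldr userid=scott/tiger control=emp.ctl log=emp.log direct=true  # direct path
```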
Conventional Path
Conventional path load (the default) uses the SQL INSERT statement and a bind array buffer to
load data into database tables.
When SQL*Loader performs a conventional path load, it competes equally with all other
processes for buffer resources. This can slow the load significantly. Extra overhead is added as
SQL statements are generated, passed to Oracle, and executed.
The Oracle database looks for partially filled blocks and attempts to fill them on each insert.
Although appropriate during normal use, this can slow bulk loads dramatically.
Direct Path
In direct path loading, Oracle does not use SQL INSERT statements to load rows. Instead, it
writes rows directly into fresh blocks beyond the high water mark in the datafiles, i.e. it does not
scan for free blocks below the high water mark. Direct path load is very fast because:
Partial blocks are not used, so no reads are needed to find them, and fewer writes are
performed.
SQL*Loader need not execute any SQL INSERT statements; therefore, the processing
load on the Oracle database is reduced.
A direct path load calls on Oracle to lock tables and indexes at the start of the load and
releases them when the load is finished. A conventional path load calls Oracle once for
each array of rows to process a SQL INSERT statement.
A direct path load uses multiblock asynchronous I/O for writes to the database files.
During a direct path load, processes perform their own write I/O, instead of using
Oracle's buffer cache. This minimizes contention with other Oracle users.
The following conditions must be satisfied for you to use the direct path load method:
This article provides 10 practical examples on how to upload data from a flat file to
Oracle tables.
$ cat employee.txt
100,Thomas,Sales,5000
200,Jason,Technology,5500
300,Mayla,Technology,7000
400,Nisha,Marketing,9500
500,Randy,Technology,6000
501,Ritu,Accounting,5400
Create the employee table in Oracle with a structure matching the data file:
create table employee
(
id integer,
name varchar2(10),
dept varchar2(15),
salary integer,
hiredon date
);
Next, create the control file that explains what needs to be uploaded and where.
$ cat sqlldr-add-new.ctl
load data
infile '/home/ramesh/employee.txt'
into table employee
fields terminated by ","
( id, name, dept, salary )
Note: If you have the values inside the data file enclosed with double quotes, use this in
your control file: fields terminated by "," optionally enclosed by '"'
Note: If you don’t have the table created, you’ll get the following error message:
You can pass the userid and password to the sqlldr command using either of the
following formats. As you see below, both of these will prompt you for the control file
location, as it was not given on the command line.
$ sqlldr scott/tiger
(or)
$ sqlldr userid=scott/tiger
control =
Execute the sqlldr command to upload these new records to the empty table, specifying
both uid/pwd and the control file location as shown below.
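The command referred to here would look something like this (path assumed from the earlier examples):

```shell
sqlldr scott/tiger control=/home/ramesh/sqlldr-add-new.ctl
```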
This will create the output log file with the same name as the control file, but with the .log
extension (instead of .ctl). Partial output is shown below.
$ cat sqlldr-add-new.log
Table EMPLOYEE:
If you are new to Oracle database, and like to install it, follow this Oracle 11g installation
guide.
$ vi newemployee.txt
600,Ritu,Accounting,5400
700,Jessica,Marketing,7800
If you create a similar control file like the previous example, you might get the following
error message.
SQL*Loader-601: For INSERT option, table must be empty. Error on table EMPLOYEE
The above indicates that the table must be empty before you can upload data using the
INSERT option of SQL*Loader.
If you would like to insert more data into the table without deleting the existing rows,
use the "append" command as shown in the following control file.
$ vi sqlldr-append-more.ctl
load data
infile '/home/ramesh/newemployee.txt'
append
into table employee
fields terminated by ","
( id, name, dept, salary )
$ cat sqlldr-add-new-with-data.ctl
load data
infile *
into table employee
fields terminated by ","
( id, name, dept, salary )
begindata
100,Thomas,Sales,5000
200,Jason,Technology,5500
300,Mayla,Technology,7000
400,Nisha,Marketing,9500
500,Randy,Technology,6000
Note: The infile is '*' in this case, as there is no separate input data file for this example.
The following example has different delimiters ($ after name, ^ after department).
$ cat employee-date.txt
100,Thomas$Sales^5000,31-JAN-2008
200,Jason$Technology^5500,01-Feb-2005
300,Mayla$Technology^7000,10-Aug-2000
400,Nisha$Marketing^9500,12-Dec-2011
500,Randy$Technology^6000,01-JAN-2007
Create the following control file and indicate the field delimiters for each and every field
using “terminated by” as shown below.
$ cat sqlldr-date.ctl
load data
infile '/home/ramesh/employee-date.txt'
into table employee
fields terminated by ","
( id, name terminated by "$", dept terminated by "^", salary, hiredon DATE "dd-mon-yyyy" )
For this example, let us use the following file, which has fixed-length data. For
example, the first three characters are always the employee number, the next five characters are
always the employee name, and so on.
$ cat employee-fixed.txt
200JasonTechnology5500
300MaylaTechnology7000
400NishaTechnology9500
500RandyTechnology6000
Create the following control file, where you specify the position of each field
using the "position(start:end)" syntax as shown below.
$ cat sqlldr-fixed.ctl
load data
infile '/home/ramesh/employee-fixed.txt'
into table employee
( id position(1:3),
name position(4:8),
dept position(9:18),
salary position(19:22) )
Load this fixed-length data using sqlldr as shown below.
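A typical invocation (control file name taken from above) would be:

```shell
sqlldr scott/tiger control=/home/ramesh/sqlldr-fixed.ctl
```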
id is incremented by 999 before uploading, i.e. if the emp id is 100 in the data file, it will
be loaded as 1099.
The name is converted to upper case and loaded. This uses the upper function.
If the department contains the value "Technology", it is changed to "Techies". This uses the
decode function.
$ cat sqlldr-change-data.ctl
load data
infile '/home/ramesh/employee.txt'
into table employee
fields terminated by ","
( id ":id+999",
name "upper(:name)",
dept "decode(:dept,'Technology','Techies',:dept)",
salary )
Load the data using this control file which will massage the data before uploading it.
Verify that the data got changed while loading as per our rules.
The following control file loads data from two different data files (employee.txt and
newemployee.txt) to the employee table.
$ cat sqlldr-add-multiple.ctl
load data
infile '/home/ramesh/employee.txt'
infile '/home/ramesh/newemployee.txt'
append
into table employee
fields terminated by ","
( id, name, dept, salary )
Load the data using this control file which will upload data from multiple data files as
shown below.
Create a second table (here named bonus) to hold the id and bonus columns:
create table bonus
( id integer,
bonus integer
);
Create the employee-bonus.txt data file that contains the fields: id, name, department,
salary, bonus
$ cat employee-bonus.txt
Create the control file as shown below, which will upload the data from the above file to
two different tables. As shown below, you should have two “into table” commands, and
specify the position of the data which needs to be used to upload the data to that
column.
$ cat sqlldr-multiple-tables.ctl
load data
infile '/home/ramesh/employee-bonus.txt'
into table employee
( id position(1:3),
name position(5:10),
dept position(12:21),
salary position(23:26))
into table bonus
( id position(1:3),
bonus position(28:31))
Load the data to multiple tables using this control file as shown below.
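The command itself is not shown; it would look something like this (path assumed from the earlier examples):

```shell
sqlldr scott/tiger control=/home/ramesh/sqlldr-multiple-tables.ctl
```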
ID BONUS
---------- ----------
100 1000
200 2000
300 2000
400 1000
500 3000
$ cat employee-bad.txt
100,Thomas,Sales,5000
200,Jason,Technology,5500
300,Mayla,Technology,7K
400,Nisha,Marketing,9500
500,Randy,Technology,6K
$ cat sqlldr-bad.ctl
load data
infile '/home/ramesh/employee-bad.txt'
into table employee
fields terminated by ","
( id, name, dept, salary )
Load the data (including the invalid records) using this control file as shown below.
As you see from the above output, it still says "logical record count 5", but you should
check the log files to see whether it has rejected any records.
The log file indicates that 2 records are rejected as shown below:
Table EMPLOYEE:
By default the rejected records are stored in a file that has the same name as the data file
(but with .bad extension)
$ cat employee-bad.bad
300,Mayla,Technology,7K
500,Randy,Technology,6K
As you see below, the employee table has only 3 records (as 2 of them were rejected).
Add the line “when” next to “into table” line. In the following control file, the when
clause indicates that it will load only the records that have dept as “Technology”.
$ cat sqlldr-when.ctl
load data
infile '/home/ramesh/employee.txt'
into table employee
when dept = 'Technology'
fields terminated by ","
( id, name, dept, salary )
Load the selective data (only the “Technology” records) using this control file as shown
below.
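The command itself would look something like this (path assumed from the earlier examples):

```shell
sqlldr scott/tiger control=/home/ramesh/sqlldr-when.ctl
```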
As you see from the above output, it still says “logical record count 5”, but you should
check the log files to see how many records were loaded, and how many records were
discarded because it didn’t match the when condition.
The following from the log file shows that 5 records were read, and 2 of them were
discarded as they didn't match the when condition.
Verify that only the selective records were loaded into the table.
@Rohit,
This is not possible using SQL*Loader alone, because the data type of VendorId is numeric and
from the infile you are getting characters. You have two approaches here:
1) Either preprocess your infile first and replace X with 1, Y with 2, Z with 3, then use SQL*Loader
with a WHEN condition to check which vendor id it is and what value the amount column should
take. OR
2) You need to use external tables and SQL functions in this scenario.
Refer:
http://asktom.oracle.com/pls/asktom/f?
p=100:11:0::::P11_QUESTION_ID:1710164700346004127
I tried using BOUNDFILLER and DECODE, and my control file looks like this:
INFILE 'sale_exec.dat'
APPEND
PRESERVE BLANKS
INTO TABLE SALES
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
(
c1 BOUNDFILLER,
REPORT_DATE,
c2 BOUNDFILLER,
c3 BOUNDFILLER,
VENDORID "to_number(DECODE(:c1,'X',1,'Y',2,3))",
AMOUNT "to_number(DECODE(:c1,'Z',:c3,:c2))"
)
This worked well for me.
Regards,
Rohit K
I have a different scenario. I have a table with 5 columns, c1, c2, c3, c4, c5, and a csv file
has 6 columns: a, c1, c2, c3, c4, c5. I would like to load the c1 to c5 columns' data from the csv
file into the c1 to c5 columns in the table. Can we skip any columns from the csv, or map
csv columns to table columns in the loader control file?
I got it. Putting the FILLER keyword after the column name skips that column. For example:
OPTIONS (SKIP=1)
LOAD DATA
INFILE 'source.csv'
BADFILE 'source.csv.bad'
DISCARDFILE 'source.csv.dsc'
APPEND
INTO TABLE tab_name
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( a FILLER
,c1
,c2
,c3
,c4
,c5
)
General
READSIZE = n
RESUMABLE = {TRUE | FALSE}
RESUMABLE_NAME = 'text string'
RESUMABLE_TIMEOUT = n
ROWS = n
SILENT = {HEADER | FEEDBACK | ERRORS | DISCARDS |
PARTITIONS | ALL}
SKIP = n
SKIP_INDEX_MAINTENANCE = {TRUE | FALSE}
SKIP_UNUSABLE_INDEXES = {TRUE | FALSE}
STREAMSIZE = n
OPTIONS (BINDSIZE=100000, SILENT=(ERRORS, FEEDBACK))
PATHS
CONVENTIONAL PATH
DIRECT PATH
All loads demonstrated below are conventional path, with the exception of demo 6.
TERMINATORS
Comma ','
Tab 0x'09'
TRAILING NULLCOLS
-- assuming this data
10 Accounting
-- the following
INTO TABLE dept
TRAILING NULLCOLS
( deptno CHAR TERMINATED BY " ",
dname CHAR TERMINATED BY WHITESPACE,
loc CHAR TERMINATED BY WHITESPACE)
CONTINUEIF THIS (1:2) = '%%'
CREATE TABLE emp (
empno NUMBER(4),
ename VARCHAR2(10),
job VARCHAR2(10),
mgr NUMBER(4),
hiredate DATE,
sal NUMBER(8,2),
comm NUMBER(7,2),
deptno NUMBER(2),
projno NUMBER(4),
loadseq NUMBER(3));
CREATE TABLE proj (
emp NUMBER(4),
projno NUMBER(3));
CREATE TABLE funcdemo (
last_name VARCHAR2(20),
first_name VARCHAR2(20));
CREATE TABLE decodemo (
fld1 VARCHAR2(20),
fld2 VARCHAR2(20));
CREATE TABLE denver_prj (
projno VARCHAR2(3),
empno NUMBER(5),
projhrs NUMBER(2));
CREATE TABLE orlando_prj (
projno VARCHAR2(3),
empno NUMBER(5),
projhrs NUMBER(2));
CREATE TABLE misc_prj (
projno VARCHAR2(3),
empno NUMBER(5),
projhrs NUMBER(2));
CREATE TABLE po_tab OF XMLTYPE;
CREATE TABLE loadnums(
col1 VARCHAR2(10),
col2 NUMBER);
Demo 1
Basic import of delimited data with data in the control file
OPTIONS (ERRORS=500, SILENT=(FEEDBACK))
LOAD DATA
INFILE *
INTO TABLE <table_name>
FIELDS TERMINATED BY <delimiter>
OPTIONALLY ENCLOSED BY <enclosing character>
(<column_name>, <column_name>, <column_name>)
sqlldr userid=uwclass/uwclass control=c:\load\demo01.ctl
log=d:\load\demo01.log
Demo 2
Basic import of fixed length data with separate data and control files
LOAD DATA
INFILE <data_file_path_and_name>
INTO TABLE <table_name> (
<column_name> POSITION(<integer>:<integer>) <data_type>,
<column_name> POSITION(<integer>:<integer>) <data_type>,
<column_name> POSITION(<integer>:<integer>) <data_type>)
sqlldr userid=uwclass/uwclass control=c:\load\demo02.ctl
log=c:\load\demo02.log
Demo 3
Append of delimited data with data in the control file. This sample demonstrates date formatting,
delimiters within delimiters, and implementation of record numbering with a SQL*Loader SEQUENCE.
APPEND indicates that the table need not be empty before SQL*Loader is run.
LOAD DATA
INFILE *
APPEND
INTO TABLE emp
FIELDS TERMINATED BY ","
OPTIONALLY ENCLOSED BY '"'
(<column_name>, <column_name> DATE "DD-Month-YYYY",
<column_name> CHAR TERMINATED BY ':',
<column_name> SEQUENCE(MAX,1))
sqlldr userid=uwclass/uwclass control=c:\load\demo03.ctl
log=c:\load\demo3.log
Demo 4
Replace of fixed-length data with separate data and control files. This sample demonstrates specifying a
discard file, the maximum number of records to discard (DISCARDMAX), and CONTINUEIF, where it
looks for an asterisk in the first position to determine whether a new line has started.
LOAD DATA
INFILE 'c:\temp\demo04.dat'
DISCARDFILE 'c:\temp\demo4.dsc'
DISCARDMAX 999
REPLACE
CONTINUEIF THIS (1) = '*'
INTO TABLE emp (
empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
hiredate POSITION(52:60) INTEGER EXTERNAL)
sqlldr userid=uwclass/uwclass control=c:\load\demo04.ctl
log=c:\load\demo4.log
Demo 5
Loading into multiple tables during an import using the WHEN keyword. The control file loads two different
tables making three passes at one of them. Note the problem with the Doolittle record and how it is handled.
LOAD DATA
INFILE 'c:\temp\demo05.dat'
BADFILE 'c:\temp\bad05.bad'
DISCARDFILE 'c:\temp\disc05.dsc'
REPLACE
INTO TABLE emp (
empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
deptno POSITION(17:18) CHAR,
mgr POSITION(20:23) INTEGER EXTERNAL)
-- 2nd project
INTO TABLE proj
WHEN projno != ' ' (
emp POSITION(1:4) INTEGER EXTERNAL,
projno POSITION(29:31) INTEGER EXTERNAL)
-- 3rd project
INTO TABLE proj
WHEN projno != ' ' (
emp POSITION(1:4) INTEGER EXTERNAL,
projno POSITION(33:35) INTEGER EXTERNAL)
sqlldr userid=uwclass/uwclass control=c:\load\demo5.ctl
log=d:\load\demo5.log
Demo 6
Using the NULLIF and BLANKS keywords to handle zero-length strings being loaded into numeric
columns. Also note the use of a direct path load (DIRECT=TRUE on the command line).
LOAD DATA
INFILE 'c:\temp\demo06.dat'
INSERT
INTO TABLE emp
-- SORTED INDEXES (emp_empno)
(
empno POSITION(01:04) INTEGER EXTERNAL NULLIF empno=BLANKS,
ename POSITION(06:15) CHAR,
job POSITION(17:25) CHAR,
mgr POSITION(27:30) INTEGER EXTERNAL NULLIF mgr=BLANKS,
sal POSITION(32:39) DECIMAL EXTERNAL NULLIF sal=BLANKS,
comm POSITION(41:48) DECIMAL EXTERNAL NULLIF comm=BLANKS,
deptno POSITION(50:51) INTEGER EXTERNAL NULLIF deptno=BLANKS)
sqlldr userid=uwclass/uwclass control=c:\load\demo06.ctl
log=c:\load\demo06.log DIRECT=TRUE
Demo 7
Using a built-in function to modify data during loading
LOAD DATA
INFILE *
INSERT
INTO TABLE funcdemo
(
LAST_NAME position(1:7) CHAR "UPPER(:LAST_NAME)",
FIRST_NAME position(8:15) CHAR "LOWER(:FIRST_NAME)"
)
BEGINDATA
Locke Phil
Gorman Tim
sqlldr userid=uwclass/uwclass control=c:\load\demo07.ctl
log=c:\load\demo07.log
Demo 8
Another example of using a built-in function, in this case DECODE, to modify data during loading
LOAD DATA
INFILE *
INSERT
INTO TABLE decodemo
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(
fld1,
fld2 "DECODE(:fld1, 'hello', 'goodbye', :fld1)"
)
BEGINDATA
hello,""
goodbye,""
this is a test,""
hello,""
sqlldr userid=uwclass/uwclass control=c:\load\demo08.ctl
log=c:\load\demo08.log
Demo 9
Loading multiple files into multiple tables in a single control file. Note the use of the WHEN keyword.
LOAD DATA
INFILE 'c:\temp\demo09a.dat'
INFILE 'c:\temp\demo09b.dat'
APPEND
INTO TABLE denver_prj
WHEN projno = '101' (
projno position(1:3) CHAR,
empno position(4:8) INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)
INTO TABLE orlando_prj
WHEN projno = '202' (
projno position(1:3) CHAR,
empno position(4:8) INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)
INTO TABLE misc_prj
WHEN projno != '101' AND projno != '202' (
projno position(1:3) CHAR,
empno position(4:8) INTEGER EXTERNAL,
projhrs position(9:10) INTEGER EXTERNAL)
sqlldr userid=uwclass/uwclass control=c:\load\demo09.ctl
log=c:\load\demo09.log
Demo 10
Loading negative numeric values. Note Clark's and Miller's records in the data file, and note the empty row.
LOAD DATA
INFILE 'c:\temp\demo10.dat'
INTO TABLE emp
REJECT ROWS WITH ALL NULL FIELDS
(
empno POSITION(01:04) INTEGER EXTERNAL,
ename POSITION(06:15) CHAR,
job POSITION(17:25) CHAR,
mgr POSITION(27:30) INTEGER EXTERNAL,
sal POSITION(32:39) DECIMAL EXTERNAL,
comm POSITION(41:48) DECIMAL EXTERNAL,
deptno POSITION(50:51) INTEGER EXTERNAL)
sqlldr userid=uwclass/uwclass control=c:\load\demo10.ctl
log=c:\load\demo10.log
Demo 11
Loading XML
LOAD DATA
INFILE *
INTO TABLE po_tab
APPEND
XMLTYPE (xmldata)
FIELDS
(xmldata CHAR(2000))
desc po_tab
SELECT * FROM po_tab;
SELECT *
FROM po_tab
WHERE sys_nc_rowinfo$ LIKE '%Hurry%';
Demo 12
Loading a CONSTANT, RECNUM, and SYSDATE
OPTIONS (ERRORS=100, SILENT=(FEEDBACK))
LOAD DATA
INFILE *
REPLACE
INTO TABLE dept
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(recno RECNUM, deptno CONSTANT "XX", dname, loc,
tdate SYSDATE)
ALTER TABLE dept
ADD (recno NUMBER(5), tdate DATE);
desc dept
Note: The control file and data for this demo can be found in the /demo/schema/sales_history
schema under $ORACLE_HOME as cust1v3.ctl and cust1v3.dat. BINDSIZE and READSIZE do not
apply to direct path loads.
LOAD DATA
INFILE 'c:\temp\cust1v3.dat'
TRUNCATE
INTO TABLE CUSTOMERS
FIELDS TERMINATED BY '|'
OPTIONALLY ENCLOSED BY '"'
(CUST_ID, CUST_FIRST_NAME, CUST_LAST_NAME,
CUST_GENDER, CUST_YEAR_OF_BIRTH,
CUST_MARITAL_STATUS, CUST_STREET_ADDRESS,
CUST_POSTAL_CODE, CUST_CITY, CUST_CITY_ID,
CUST_STATE_PROVINCE, CUST_STATE_PROVINCE_ID,
COUNTRY_ID, CUST_MAIN_PHONE_NUMBER,
CUST_INCOME_LEVEL, CUST_CREDIT_LIMIT, CUST_EMAIL,
CUST_TOTAL, CUST_TOTAL_ID, CUST_SRC_ID,
CUST_EFF_FROM DATE(19) "YYYY-MM-DD-HH24-MI-SS",
CUST_EFF_TO DATE(19) "YYYY-MM-DD-HH24-MI-SS",
CUST_VALID)
conn sh/sh
conn uwclass/uwclass
CREATE TABLE customers AS
SELECT * FROM sh.customers
WHERE 1=2;
desc customers
One can load data into an Oracle database by using the sqlldr (sqlload on some platforms) utility. Invoke the
utility without arguments to get a list of available parameters. Look at the following example:
This sample control file (loader.ctl) will load an external data file containing delimited data:
load data
infile 'c:\data\mydata.csv'
into table emp
fields terminated by "," optionally enclosed by '"'
( empno, empname, sal, deptno )
Optionally, you can work with tabulation delimited files by using one of the following syntaxes:
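The syntaxes themselves are missing above; two common ways to specify a tab delimiter in a control file are sketched here (note that WHITESPACE matches any run of spaces or tabs, not tabs alone):

```sql
fields terminated by X'09'       -- tab character given by its hex value
fields terminated by WHITESPACE  -- any whitespace, including tabs
```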
Additionally, if your file was in Unicode, you could make the following addition.
load data
CHARACTERSET UTF16
infile 'c:\data\mydata.csv'
into table emp
fields terminated by "," optionally enclosed by '"'
( empno, empname, sal, deptno )
Another sample control file with in-line data formatted as fixed-length records. The trick is to specify "*" as the
name of the data file, and use BEGINDATA to start the data section in the control file:
load data
infile *
replace
into table departments
( dept position (02:05) char(4),
deptname position (08:27) char(20)
)
begindata
Tools:
If you need a utility to load Excel data into Oracle, download quickload from sourceforge
at http://sourceforge.net/projects/quickload
set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
spool oradata.txt
select col1 || ',' || col2 || ',' || col3
from tab1
where col2 = 'XYZ';
spool off
Warning: if your data contains a comma, choose another separator that is not in the data. You can also enclose
the column that contains the comma between ".
You can also use the "set colsep" command if you don't want to put the commas in by hand. This saves a lot of
typing. Example:
Using PL/SQL
PL/SQL's UTL_FILE package can also be used to unload data. Example:
declare
fp utl_file.file_type;
begin
fp := utl_file.fopen('c:\oradata','tab1.txt','w');
utl_file.putf(fp, '%s, %s\n', 'TextField', 55);
utl_file.fclose(fp);
end;
/
LOAD DATA
INFILE *
INTO TABLE load_delimited_data
FIELDS TERMINATED BY "," OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( data1,
data2
)
BEGINDATA
11111,AAAAAAAAAA
22222,"A,B,C,D,"
NOTE: The default data type in SQL*Loader is CHAR(255). To load character fields longer than 255
characters, code the type and length in your control file. By doing this, Oracle will allocate a big enough buffer
to hold the entire column, thus eliminating potential "Field in data file exceeds maximum length" errors.
Example:
...
resume char(4000),
...
LOAD DATA
INFILE *
INTO TABLE load_positional_data
( data1 POSITION(1:5),
data2 POSITION(6:15)
)
BEGINDATA
11111AAAAAAAAAA
22222BBBBBBBBBB
For example, position(01:05) will give the 1st to the 5th character (11111 and 22222).
OPTIONS (SKIP=5)
LOAD DATA
INFILE *
INTO TABLE load_positional_data
( data1 POSITION(1:5),
data2 POSITION(6:15)
)
BEGINDATA
11111AAAAAAAAAA
22222BBBBBBBBBB
...
sqlldr userid=ora_id/ora_passwd control=control_file_name.ctl skip=4
If you are continuing a multiple table direct path load, you may need to use the CONTINUE_LOAD clause
instead of the SKIP parameter. CONTINUE_LOAD allows you to specify a different number of rows to skip for
each of the tables you are loading.
LOAD DATA
INFILE *
INTO TABLE modified_data
( rec_no "my_db_sequence.nextval",
region CONSTANT '31',
time_loaded "to_char(SYSDATE, 'HH24:MI')",
data1 POSITION(1:5) ":data1/100",
data2 POSITION(6:15) "upper(:data2)",
data3 POSITION(16:22)"to_date(:data3, 'YYMMDD')"
)
BEGINDATA
11111AAAAAAAAAA991201
22222BBBBBBBBBB990112
LOAD DATA
INFILE 'mail_orders.txt'
BADFILE 'bad_orders.txt'
APPEND
INTO TABLE mailing_list
FIELDS TERMINATED BY ","
( addr,
city,
state,
zipcode,
mailing_addr "decode(:mailing_addr, null, :addr, :mailing_addr)",
mailing_city "decode(:mailing_city, null, :city, :mailing_city)",
mailing_state,
move_date "substr(:move_date, 3, 2) || substr(:move_date, 7, 2)"
)
Can one load data from multiple files / into multiple tables at once?
LOAD DATA
INFILE file1.dat
INFILE file2.dat
INFILE file3.dat
APPEND
INTO TABLE emp
( empno POSITION(1:4) INTEGER EXTERNAL,
ename POSITION(6:15) CHAR,
deptno POSITION(17:18) CHAR,
mgr POSITION(20:23) INTEGER EXTERNAL
)
LOAD DATA
INFILE *
INTO TABLE tab1 WHEN tab = 'tab1'
( tab FILLER CHAR(4),
col1 INTEGER
)
INTO TABLE tab2 WHEN tab = 'tab2'
( tab FILLER POSITION(1:4),
col1 INTEGER
)
BEGINDATA
tab1|1
tab1|2
tab2|2
tab3|3
LOAD DATA
INFILE 'mydata.dat'
REPLACE
Can one selectively load only the records that one needs?
Look at this example, (01) is the first character, (30:37) are characters 30 to 37:
LOAD DATA
INFILE 'mydata.dat' BADFILE 'mydata.bad' DISCARDFILE 'mydata.dis'
APPEND
INTO TABLE my_selective_table
WHEN (01) <> 'H' and (01) <> 'T' and (30:37) = '20031217'
(
region CONSTANT '31',
service_key POSITION(01:11) INTEGER EXTERNAL,
call_b_no POSITION(12:29) CHAR
)
NOTE: SQL*Loader does not allow the use of OR in the WHEN clause. You can only use AND as in the
example above! To workaround this problem, code multiple "INTO TABLE ... WHEN" clauses. Here is an
example:
LOAD DATA
INFILE 'mydata.dat' BADFILE 'mydata.bad' DISCARDFILE 'mydata.dis'
APPEND
INTO TABLE my_selective_table
WHEN (01) <> 'H' and (01) <> 'T'
(
region CONSTANT '31',
service_key POSITION(01:11) INTEGER EXTERNAL,
call_b_no POSITION(12:29) CHAR
)
INTO TABLE my_selective_table
WHEN (30:37) = '20031217'
(
region CONSTANT '31',
LOAD DATA
TRUNCATE INTO TABLE T1
FIELDS TERMINATED BY ','
( field1,
field2 FILLER,
field3
)
BOUNDFILLER (available with Oracle 9i and above) can be used if the skipped column's value will be required
later again. Here is an example:
LOAD DATA
INFILE *
TRUNCATE INTO TABLE sometable
FIELDS TERMINATED BY "," trailing nullcols
(
c1,
field2 BOUNDFILLER,
field3 BOUNDFILLER,
field4 BOUNDFILLER,
field5 BOUNDFILLER,
c2 ":field2 || :field3",
c3 ":field4 + :field5"
)
CONCATENATE - use when SQL*Loader should combine the same number of physical records
together to form one logical record.
CONTINUEIF - use if a condition indicates that multiple records should be treated as one. Eg. by
having a '#' character in column 1.
How does one load records with multi-line fields?
Using Stream Record format, you can define a record delimiter, so that you're allowed to have the default
delimiter ('\n') in the field's content.
After the INFILE clause set the delimiter:
load data
infile "test.dat" "str '|\n'"
into table test_table
fields terminated by ';' TRAILING NULLCOLS
(
desc,
txt
)
test.dat:
Note that this doesn't seem to work with inline data (INFILE * and BEGINDATA).
How can one get SQL*Loader to COMMIT only at the end of the load file?
One cannot, but by setting the ROWS= parameter to a large value, committing can be reduced. Make sure you
have big rollback segments ready when you use a high value for ROWS=.
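For example, to commit only after each batch of roughly 50,000 rows rather than the default bind-array size (the value here is illustrative; the right value depends on your available rollback/undo space):

```shell
sqlldr scott/tiger control=emp.ctl rows=50000
```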
Add the following option in the command line: DIRECT=TRUE. This will effectively bypass most of the
RDBMS processing. However, there are cases when you can't use direct load. For details, refer to the
FAQ about the differences between the conventional and direct path loader below.
Turn off database logging by specifying the UNRECOVERABLE option. This option can only be used
with direct data loads.
How does one use SQL*Loader to load images, sound clips and documents?
SQL*Loader can load data from a "primary data file", SDF (Secondary Data file - for loading nested tables and
VARRAYs) or LOBFILE. The LOBFILE method provides an easy way to load documents, photos, images and
audio clips into BLOB and CLOB columns. Look at this example:
Given the following table:
Control File:
LOAD DATA
INFILE *
INTO TABLE image_table
REPLACE
FIELDS TERMINATED BY ','
(
image_id INTEGER(5),
file_name CHAR(30),
image_data LOBFILE (file_name) TERMINATED BY EOF
)
BEGINDATA
001,image1.gif
002,image2.jpg
003,image3.jpg
LOAD DATA
CHARACTERSET WE8EBCDIC500
INFILE data.ebc "fix 86 buffers 1024"
BADFILE 'data.bad'
DISCARDFILE 'data.dsc'
REPLACE
INTO TABLE temp_data
(
field1 POSITION (1:4) INTEGER EXTERNAL,
field2 POSITION (5:6) INTEGER EXTERNAL,
field3 POSITION (7:12) INTEGER EXTERNAL,
field4 POSITION (13:42) CHAR,