Oracle Data Pump in Oracle Database 10g: Getting Started
Oracle Data Pump is a newer, faster and more flexible alternative to the "exp" and "imp" utilities used in
previous Oracle versions. In addition to basic import and export functionality, Data Pump provides a
PL/SQL API and support for external tables.
Getting Started
Table Exports/Imports
Schema Exports/Imports
Database Exports/Imports
Miscellaneous Information
External Tables
Help
o expdp
o impdp
Getting Started
For the examples to work we must first unlock the SCOTT account and create a directory object it can
access. The directory object is only a pointer to a physical directory, creating it does not actually create
the physical directory on the file system.
CONN / AS SYSDBA
ALTER USER scott IDENTIFIED BY tiger ACCOUNT UNLOCK;
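The directory object can then be created and access granted to SCOTT. A minimal sketch is shown below; the file system path is an assumption and should be adjusted to a directory that already exists on the database server.

```sql
-- The path is illustrative; it must already exist on the database server.
CREATE OR REPLACE DIRECTORY test_dir AS '/u01/app/oracle/oradata/';

-- Allow SCOTT to read and write files via the directory object.
GRANT READ, WRITE ON DIRECTORY test_dir TO scott;
```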
Table Exports/Imports
The TABLES parameter is used to specify the tables that are to be exported. The following is an example
of the table export and import syntax.
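A minimal sketch, assuming the TEST_DIR directory object created in the Getting Started section and a "db10g" connect string:

```shell
expdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log

impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log
```

The import reads the dump file produced by the export; the LOGFILE parameter names are arbitrary.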
Schema Exports/Imports
The OWNER parameter of exp has been replaced by the SCHEMAS parameter which is used to specify the
schemas to be exported. The following is an example of the schema export and import syntax.
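A sketch of the equivalent commands, assuming the same TEST_DIR directory object and connect string as the previous examples:

```shell
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impdpSCOTT.log
```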
Database Exports/Imports
The FULL parameter indicates that a complete database export is required. The following is an example
of the full database export and import syntax.
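A sketch, assuming a privileged account (SYSTEM is used here) since full exports and imports require the EXP_FULL_DATABASE and IMP_FULL_DATABASE roles respectively:

```shell
expdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expdpDB10G.log

impdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impdpDB10G.log
```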
The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects. Their
syntax is shown below.
INCLUDE=object_type[:name_clause] [, ...]
EXCLUDE=object_type[:name_clause] [, ...]
The following code shows how they can be used as command line parameters.
expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')"
directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
A single import/export can include multiple references to the parameters, so to export tables, views and
some packages we could use either of the following approaches.
INCLUDE=TABLE,VIEW,PACKAGE:"LIKE '%API'"
or
INCLUDE=TABLE
INCLUDE=VIEW
INCLUDE=PACKAGE:"LIKE '%API'"
The NETWORK_LINK parameter identifies a database link to be used as the source for a network
export/import. The following database link, created in a test user's schema, will be used in the examples.
The 'DEV' TNS alias is assumed to point at the remote database.
CONN / AS SYSDBA
GRANT CREATE DATABASE LINK TO test;

CONN test/test
CREATE DATABASE LINK remote_scott CONNECT TO scott IDENTIFIED BY tiger USING 'DEV';
In the case of exports, the NETWORK_LINK parameter identifies the database link pointing to the source
server. The objects are exported from the source server in the normal manner, but written to a directory
object on the local server, rather than one on the source server. Both the local and remote users require
the EXP_FULL_DATABASE role granted to them.
For imports, the NETWORK_LINK parameter also identifies the database link pointing to the source server.
The difference here is the objects are imported directly from the source into the local server without being
written to a dump file. Although there is no need for a DUMPFILE parameter, a directory object is still
required for the logs associated with the operation. Both the local and remote users require
the IMP_FULL_DATABASE role granted to them.
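A sketch of network export and import commands using the REMOTE_SCOTT link created above; the table, directory and file names are carried over from the earlier examples.

```shell
# Network export: data is read over the database link, but the dump file
# is written to a directory object on the local server.
expdp test/test@db10g tables=SCOTT.EMP network_link=REMOTE_SCOTT directory=TEST_DIR dumpfile=EMP.dmp logfile=expdpEMP.log

# Network import: no dump file is produced; the directory object is
# needed only for the log file. REMAP_SCHEMA is shown as one common option.
impdp test/test@db10g tables=SCOTT.EMP network_link=REMOTE_SCOTT directory=TEST_DIR logfile=impdpEMP.log remap_schema=SCOTT:TEST
```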
Unlike the original exp and imp utilities, all Data Pump ".dmp" and ".log" files are created on the Oracle
server, not the client machine.
All Data Pump actions are performed by multiple jobs (server processes, not DBMS_JOB jobs). These jobs
are controlled by a master control process, which uses Advanced Queuing. At runtime an advanced queue
table, named after the job name, is created and used by the master control process. The table is dropped
on completion of the Data Pump job. The job and the advanced queue can be named using
the JOB_NAME parameter. Cancelling the client process does not stop the associated Data Pump job.
Issuing "ctrl+c" on the client during a job stops the client output and presents a command prompt. Typing
"status" at this prompt allows you to monitor the current job.
Export> status
Job: SYS_EXPORT_FULL_01
Operation: EXPORT
Mode: FULL
State: EXECUTING
Bytes Processed: 0
Current Parallelism: 1
Job Error Count: 0
Dump File: D:\TEMP\DB10G.DMP
bytes written: 4,096
Worker 1 Status:
State: EXECUTING
Object Schema: SYSMAN
Object Name: MGMT_CONTAINER_CRED_ARRAY
Object Type: DATABASE_EXPORT/SCHEMA/TYPE/TYPE_SPEC
Completed Objects: 261
Total Objects: 261
Data Pump performance can be improved by using the PARALLEL parameter. This should be used in
conjunction with the "%U" wildcard in the DUMPFILE parameter to allow multiple dump files to be created
or read.
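A sketch of a parallel schema export; the "%U" wildcard is expanded to a two-digit sequence number, producing one dump file per worker.

```shell
# parallel=4 starts up to four workers; SCOTT_%U.dmp expands to
# SCOTT_01.dmp, SCOTT_02.dmp, and so on.
expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=expdpSCOTT.log
```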
Along with the Data Pump utilities, Oracle provides a PL/SQL API (DBMS_DATAPUMP). The following is an
example of how this API can be used to perform a schema export.
DECLARE
  l_dp_handle NUMBER;
BEGIN
  -- Open a schema-mode export job and keep the returned handle.
  l_dp_handle := DBMS_DATAPUMP.open(
    operation   => 'EXPORT',
    job_mode    => 'SCHEMA',
    remote_link => NULL,
    job_name    => 'SCOTT_EXPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object.
  DBMS_DATAPUMP.add_file(
    handle    => l_dp_handle,
    filename  => 'SCOTT.dmp',
    directory => 'TEST_DIR');
  -- Specify the log file.
  DBMS_DATAPUMP.add_file(
    handle    => l_dp_handle,
    filename  => 'SCOTT.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Restrict the export to the SCOTT schema.
  DBMS_DATAPUMP.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''SCOTT''');

  DBMS_DATAPUMP.start_job(l_dp_handle);

  DBMS_DATAPUMP.detach(l_dp_handle);
END;
/
Once the job has started, its status can be checked by querying the DBA_DATAPUMP_JOBS view.
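For example, connected as a privileged user (SYSTEM is assumed here):

```sql
-- Lists the owner, name, operation and state of each active Data Pump job.
SELECT * FROM dba_datapump_jobs;
```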
External Tables
Oracle have incorporated support for data pump technology into external tables. The
ORACLE_DATAPUMP access driver can be used to unload data to data pump export files and
subsequently reload it. The unload of data occurs when the external table is created using the "AS"
clause.
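A sketch of an unload using the ORACLE_DATAPUMP access driver; the table name and file name are illustrative, and the TEST_DIR directory object from the earlier examples is assumed.

```sql
-- Unload the contents of EMP to a Data Pump file via CREATE TABLE ... AS.
CREATE TABLE emp_xt
  ORGANIZATION EXTERNAL
  (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY test_dir
    LOCATION ('emp_xt.dmp')
  )
  AS SELECT * FROM emp;
```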
The syntax to create the external table pointing to an existing file is similar, but without the "AS" clause.
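A sketch of the read side; since there is no "AS" clause, the column definitions must be listed explicitly. The columns shown assume the standard SCOTT.EMP table.

```sql
-- Point a new external table at the previously created dump file.
CREATE TABLE emp_xt_2
  (
    empno    NUMBER(4),
    ename    VARCHAR2(10),
    job      VARCHAR2(9),
    mgr      NUMBER(4),
    hiredate DATE,
    sal      NUMBER(7,2),
    comm     NUMBER(7,2),
    deptno   NUMBER(2)
  )
  ORGANIZATION EXTERNAL
  (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY test_dir
    LOCATION ('emp_xt.dmp')
  );
```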
Help
expdp
expdp help=y
You can control how Export runs by entering the 'expdp' command followed
by various parameters. To specify parameters, you use keywords:
Command Description
------------------------------------------------------------------------------
ADD_FILE Add dumpfile to dumpfile set.
ADD_FILE=<dumpfile_name>
CONTINUE_CLIENT Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT Quit client session and leave job running.
HELP Summarize interactive commands.
KILL_JOB Detach and delete job.
PARALLEL Change the number of active workers for current job.
PARALLEL=<number of workers>.
START_JOB Start/resume current job.
STATUS Frequency (secs) job status is to be monitored where
the default (0) will show new status when available.
STATUS=[interval]
STOP_JOB Orderly shutdown of job execution and exits the client.
STOP_JOB=IMMEDIATE performs an immediate shutdown of the
Data Pump job.
impdp
impdp help=y
The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:
You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords: