Datapump: Utility To Export/Import Data Gary M. Noble LDS Church

Datapump is a utility that can export and import data and metadata from Oracle databases. It has several advantages over the traditional Export/Import utilities, such as being faster, supporting new Oracle data types, and having additional features like parallel processing. Datapump runs as part of the database instance, unlike Export/Import which run as client processes. It uses command line clients and APIs to control export and import jobs.


DATAPUMP

UTILITY TO EXPORT/IMPORT DATA


Gary M. Noble
LDS Church
Background
• Experience with Oracle Databases
• Familiar with Export/Import Utility
• RMAN backups
• Datapump to replace old Export/Import
Introduction
• More features than the standard
export/import
• Use in addition to RMAN backups
• Another means to upgrade to a higher version
of Oracle
Why Use Datapump
• DataPump handles Oracle 10g data types.
• DataPump features
• DataPump speed
Datapump Components
• Command line client expdp
• Command line client impdp
• DBMS_DATAPUMP (Datapump API)
• DBMS_METADATA (Metadata API)
Outline
• Datapump is faster than standard
export/import
• Setup for datapump export
• Setup for datapump import
• Datapump features
• Experiences
• Kill the job
Datapump Speed
• The standard Export and Import utilities run
as client processes
• Datapump is run as part of the database
instance on the database server
• Datapump can do parallel work
– Create multiple thread processes
– Create multiple data files (file sets)
Datapump Control
• Master Control Process
• Master Table
• Worker Process
Major Features
• PARALLEL – maximum number of threads
• START_JOB – ability to restart a job
• ATTACH – detach and reattach to job
• NETWORK_LINK – export and import over
network
• REMAP_DATAFILE – import to a different
datafile.
• REMAP_TABLESPACE – map to new tablespace
Additional Datapump Features
• Filter by using EXCLUDE and INCLUDE
• VERSION – specify version of objects.
Parameters are compatible, latest, or version
number.
DataPump Export Setup
• Make a server file system directory
• Create a database directory that references
the file system directory
• Grant read write privileges to the directory
• Grant privileges for full export
• Create an export parameter file
Default Datapump Directory
• Oracle default datapump directory
DATA_PUMP_DIR
• $ORACLE_HOME/rdbms/log/
• Information is found in table
DBA_DIRECTORIES
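The directory objects known to the database, including the default DATA_PUMP_DIR, can be confirmed with a query against the DBA_DIRECTORIES view named above; a sketch (requires a live Oracle session with DBA privileges):

```sql
-- List all directory objects and their file system paths;
-- the default Data Pump directory appears as DATA_PUMP_DIR.
SELECT owner, directory_name, directory_path
  FROM dba_directories
 ORDER BY directory_name;
```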
Export Preliminary Setup
• mkdir /backup/<database>/datapump
• Set up your environment for ORACLE_HOME
and ORACLE_SID, then sqlplus "/ as sysdba"
• CREATE DIRECTORY datapump_dir AS
'/backup/<database>/datapump' ;
• GRANT READ, WRITE ON DIRECTORY datapump_dir
TO <dp_schema> ;
• GRANT exp_full_database TO <dp_schema> ;
Start Datapump Export Job
• expdp
parfile=/backup/<database>/datapump/expdp
_<database>_<schema>.par
• expdp <dp_schema>/<password>
directory=datapump_dir schemas=<schema>
dumpfile=expdp_<database>_<schema>.dmp
parallel=4
job_name=job_expdp_<database>_<schema>
Example Export Parameter File
• Userid=<dp_schema>/<password>
• Dumpfile=expdp_<database>_<schema>.dmp
• Logfile=expdp_<database>_<schema>.log
• Directory=datapump_dir
• Schemas=<schema>
• Job_name=job_expdp_<database>_<schema>
• Status=240
Datapump Import Setup
• Make a server file system directory
• Create a database directory
• Grant read privileges to directory
• Grant privileges for full import
• Create an import parameter file
Import Preliminary Setup
• mkdir /backup/<database>/datapump
• Set up your environment for ORACLE_HOME
and ORACLE_SID, then sqlplus "/ as sysdba"
• CREATE DIRECTORY datapump_dir AS
'/backup/<database>/datapump' ;
• GRANT READ, WRITE ON DIRECTORY datapump_dir
TO <dp_schema> ;
• GRANT imp_full_database TO <dp_schema> ;
Start Datapump Import Job
• impdp
parfile=/backup/<database>/datapump/impdp_
<database>_<schema>.par
• impdp <dp_schema>/<password>
directory=datapump_dir
table_exists_action=truncate
dumpfile=expdp_<database>_<schema>.dmp
parallel=4
job_name=job_impdp_<database>_<schema>
Example Import Parfile
• Userid=<dp_schema>/<password>
• Schemas=<schema>
• Exclude=grant
• Directory=datapump_dir
• Dumpfile=expdp_<database>_<schema>.dmp
• Table_exists_action=replace
Some Basic Parameters
• Directory=Datapump_dir - Specify the
datapump directory that has been defined in
the database
• Schemas=User1,User2,User3
• Dumpfile=datapump_job_file%U.dmp
• Tables=Table1,Table2
• Estimate=Statistics – Estimate the export size
using table statistics; the default is Blocks
Important Features
• EXCLUDE – filter out schemas or object types
• REMAP_SCHEMA – map user1 to user2
• REMAP_TABLESPACE – map user1_data to
user2_data
• SQLFILE – write a script of SQL (DDL)
statements instead of importing
• STATUS – list job status every few seconds
• JOB_NAME – name the job run in the instance
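The SQLFILE feature above can be illustrated with a hypothetical impdp call that writes the DDL contained in a dump file into a script, without importing anything; all names are placeholders:

```shell
# Extract DDL from an existing dump file into ddl_<schema>.sql;
# no objects or data are actually imported.
impdp <dp_schema>/<password> directory=datapump_dir \
      dumpfile=expdp_<database>_<schema>.dmp \
      sqlfile=ddl_<schema>.sql
```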
Exclude
• EXCLUDE=USER – Exclude the user definitions
(CREATE USER), but not the objects owned by
those users
• EXCLUDE=GRANT – Exclude grants of object and
system privileges
• EXCLUDE=VIEW, PACKAGE, FUNCTION –
Exclude specific types of objects
• EXCLUDE=INDEX:"LIKE 'EMP%'" – Exclude
indexes whose names start with EMP
Include
• INCLUDE=PROCEDURE – Include just the
procedure objects
• INCLUDE=TABLE:"IN
('MANAGERS','FACILITIES')"
• INCLUDE=INDEX:"LIKE 'JOB%'"
• Note, INCLUDE and EXCLUDE parameters are
mutually exclusive
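When INCLUDE or EXCLUDE values are passed on the command line rather than in a parameter file, most shells require the quotes and parentheses to be escaped. A sketch for a bash-style shell (exact escaping rules vary by shell; using a parfile avoids the problem entirely):

```shell
# Command-line version of the INCLUDE=TABLE example above;
# the double quotes, single quotes, and parentheses are escaped
# so the shell passes them through to expdp unchanged.
expdp <dp_schema>/<password> directory=datapump_dir \
      dumpfile=expdp_tables.dmp \
      include=TABLE:\"IN \(\'MANAGERS\',\'FACILITIES\'\)\"
```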
Network_Link
• NETWORK_LINK=database_link
• On export, data is retrieved from the
referenced database and written to a dump
file on the local server.
• On import, data retrieved from the
referenced database is loaded directly into
the current database, with no dump file.
Filters
• QUERY=employees:"WHERE department_id > 10 AND
salary > 10000"
• QUERY=salary:"WHERE manager_id <> 13"
• The views DATABASE_EXPORT_OBJECTS,
SCHEMA_EXPORT_OBJECTS, and
TABLE_EXPORT_OBJECTS list what can be
filtered with EXCLUDE and INCLUDE
• For example – select object_path, comments from
schema_export_objects where object_path not like
'%/%' ;
Table_Exists_Action
• Skip
• Append
• Truncate
• Replace
Import Parameters
• Remap_schema=User1:User2
• Remap_tablespace=User1_tblspace:User2_tbl
space
• Transform=OID:N
Do not reuse the original object identifiers
(OIDs); new ones are generated on import.
• Transform=segment_attributes:N
Useful when you do not want to keep the
original storage and tablespace attributes.
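The remap and transform options above could be combined in one import parameter file; a hypothetical sketch using placeholder names:

```
# impdp parfile sketch: move user1's objects into user2,
# relocate them to a new tablespace, and drop the original
# storage attributes.
userid=<dp_schema>/<password>
directory=datapump_dir
dumpfile=expdp_<database>_user1.dmp
remap_schema=user1:user2
remap_tablespace=user1_data:user2_data
transform=segment_attributes:n
```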
Interactively Work With The Job
• Check on the status
• Stop the job
• Restart the job
• Kill the job
Check Status of Datapump Job
• Select job_name, operation, job_mode, state
from user_datapump_jobs ;
• expdp <dp_schema>/<password>
attach=<job_name>
• Status
• Exit_Client – Exit client but leave job running
• Continue_Client – Resume logging
Kill Datapump Job
• Select job_name, operation, job_mode, state
from user_datapump_jobs ;
• expdp <dp_schema>/<password>
attach=<job_name>
• Kill_job
Interactive Commands
• Add_File – Add dumpfile to dumpfile set.
• Continue_Client – Restart job if idle.
• Exit_Client – Exit interactive session.
• Filesize – File size of new files (Add_File).
• Help – Interactive session commands.
• Kill_Job – Delete the attached job and exit.
More Interactive Commands
• Parallel – Specify the maximum number of
active workers
– Set to no more than twice the number of CPUs
– Worker processes are created as needed
• Reuse_Dumpfiles – Overwrite dump file if it
exists
• Stop_Job – Stop job execution & exit client
• Start_Job – Start or resume current job
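A typical interactive sequence using these commands might look like the following sketch; job and schema names are placeholders, and the commands after attaching are typed at the Export> prompt:

```shell
# Attach to a running job, stop it, and later resume it.
expdp <dp_schema>/<password> attach=job_expdp_<database>_<schema>
# Export> stop_job=immediate    -- stop the job but keep it restartable
# Export> exit_client           -- leave the job definition in place
# ... later, reattach and resume ...
expdp <dp_schema>/<password> attach=job_expdp_<database>_<schema>
# Export> start_job             -- resume the stopped job
# Export> continue_client       -- return to logging mode
```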
Restrictions
• DataPump is an Oracle utility; its dump files
can only be imported by Data Pump, not by the
original Import utility.
• You can still get the error "snapshot too old"
(ORA-01555).
• If the job is started using "/ as sysdba", you
need to know the Oracle database system
password to check status, kill the job, etc.
Oracle 11g Features
• Compression – besides none, and
metadata_only the new features are all, and
data_only
• Encryption – Oracle 10g exported only columns
that were already encrypted. Oracle 11g can
encrypt all of the metadata and/or data
• Data_Options – XML_CLOBS exports XML
columns in uncompressed CLOB format
More Oracle 11g Features
• Partition – On import only. Used to merge
partitions into one table.
Partition options are departition and merge
(all partitions)
• Transportable – permits exporting metadata
for specific tables.
• Remap_Data – enables data to be modified to
obscure sensitive information.
Oracle 11g OEM
• Tab Data Movement
Move Row Data
Export to Export files (expdp)
Import from Export files (impdp)
Import from Database (NETWORK_LINK)
Monitor Export and Import Jobs
Problems Encountered
• The previous run left a dump file with the
same name as the one the current job uses
• Space runs out in a tablespace
– The job suspends
– Make the datafile autoextensible or add
another datafile
• NFS mounted soft
– Set event 10298 and bounce the database
– Migrate to Oracle 11g
Job Will Not Die
• The datapump processes are killed
– Drop the master table with the same job name
– Delete the datapump file
• The datapump file has been deleted or moved
– Cannot now attach to the job
– Drop the master table with the same job name
– Delete the datapump file
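Locating the orphaned master table can be sketched with the DBA_DATAPUMP_JOBS view; the DROP below uses placeholder names and should only be run once you are sure the job is truly dead:

```sql
-- Find leftover Data Pump jobs; a job in the NOT RUNNING state
-- with no attached sessions is a candidate for cleanup.
SELECT owner_name, job_name, state
  FROM dba_datapump_jobs;

-- The master table has the same name as the job.
DROP TABLE <dp_schema>.<job_name> PURGE;
```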
Review
• Use datapump as another tool for the DBA
• Take the time to set it up properly
• Learn the basic and rich features
• Create scripts for backups and refreshes
Comments
• Comments or questions
• Thank you for coming
• References:
Oracle Database Utilities 10g Release 2 (10.2),
Part #B14215-01
Oracle Database Utilities 11g Release 1 (11.1),
Part #B28319-01
Oracle is a registered trademark of Oracle Corp.
The End
• Last slide:
- Speaker: Gary M. Noble
- Session name: Data Pump
- Contact information for further questions:
[email protected] Be sure to include
UTOUG Training Days in the title.
Thank You
