Use Oracle Data Pump
Oracle Data Pump offers very fast bulk data and metadata movement between Oracle databases and Autonomous Database on Dedicated Exadata Infrastructure.
Data Pump Import lets you import data from Data Pump files residing in Oracle Cloud Infrastructure Object Storage and Oracle Cloud Infrastructure Object Storage Classic. You can save your data to your Cloud Object Store and use Oracle Data Pump to load data to Autonomous Database.
For a "Try it out" alternative to reading the following topics, you can go through Lab 9: Migrate with Data Pump in the Oracle Autonomous Database Dedicated for Fleet Administrators Workshop.
- Export Data from Your Existing Oracle Database
  First you use Oracle Data Pump Export to export your existing Oracle Database schemas. Then you use Oracle Data Pump Import to migrate them to Autonomous Database.
- Upload the Export Files to Cloud Object Storage
  Before you can import the data you exported from the source Oracle Database, you need to upload the export files to cloud object storage.
- Import Data Using Oracle Data Pump
  Oracle recommends using the latest Oracle Data Pump version for importing data from Data Pump files into your Autonomous Database, as it contains enhancements and fixes for a better experience.
- Access Log Files for Data Pump Import
  The log files for Data Pump Import operations are stored in the directory you specify with the Data Pump impdp directory parameter.
Parent topic: Move Data Into Autonomous Database
Export Data from Your Existing Oracle Database
First you use Oracle Data Pump Export to export your existing Oracle Database schemas. Then you use Oracle Data Pump Import to migrate them to Autonomous Database.
Oracle recommends using the following Data Pump Export parameters for faster and easier migration to Autonomous Database:
exclude=cluster,db_link
parallel=n
schemas=schema name
dumpfile=export%l.dmp
Oracle Data Pump Export provides several export modes; Oracle recommends using schema mode for migrating to Autonomous Database. You can list the schemas you want to export by using the schemas parameter.
For a faster migration, export your schemas into multiple Data Pump files and use parallelism. You can specify the dump file name format you want to use with the dumpfile parameter. Set the parallel parameter to at least the number of CPUs you have in your Autonomous Database.
The exclude and data_options parameters ensure that object types not available in Autonomous Database are not exported and that table partitions are grouped together so they can be imported faster into Autonomous Database.
Oracle recommends that you collect up-to-date statistics before executing the expdp command. The dbms_stats package provides multiple procedures to collect the latest statistics. Statistics are automatically added to the export dump file, and they help determine the size of objects and optimize parallelism during the import. For more information, see DBMS_STATS Operational Notes.
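For example, the following is a minimal sketch of gathering statistics for the SH schema used in the example below; DBMS_STATS.GATHER_SCHEMA_STATS is only one of the procedures you can use, and the schema name is a placeholder:
BEGIN
  -- Gather up-to-date optimizer statistics for the schema before running expdp.
  DBMS_STATS.GATHER_SCHEMA_STATS(ownname => 'SH');
END;
/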
The following example exports the SH schema from a source Oracle Database for migration to an Autonomous Database with 16 CPUs:
expdp sh/sh@orcl \
exclude=cluster,db_link \
parallel=16 \
schemas=sh \
dumpfile=export%l.dmp \
encryption_pwd_prompt=yes
Notes for Data Pump parameters:
- If during the export with expdp you use the encryption_pwd_prompt=yes parameter, then also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files (remember the password you supply during export). The maximum length of the encryption password is 128 bytes.
- The dumpfile parameter supports the %L and %l wildcards in addition to the legacy %U and %u wildcards. For example, dumpfile=export%L.dmp. Use the %L or %l wildcard for exports from Oracle Database Release 12.2 and higher; this wildcard expands the dump file name into a 3-digit to 10-digit, variable-width incrementing integer, starting at 100 and ending at 2147483646. Use the legacy %U or %u wildcard for exports from Oracle Database releases prior to 12.2. If you use this option and more than 99 dump files are needed, you must specify multiple dump file names, each with the %U or %u wildcard.
You can use other Data Pump Export parameters, such as compression, depending on your requirements. For more information on Oracle Data Pump Export, see Oracle Database Utilities.
Parent topic: Use Oracle Data Pump
Upload the Export Files to Cloud Object Storage
Before you can import the data you exported from the source Oracle Database, you need to upload the export files to cloud object storage.
You can upload the export files to an existing storage bucket in Oracle Cloud Infrastructure Object Storage or an existing storage container in Oracle Cloud Infrastructure Object Storage Classic. Or you can use the following procedure to create a new storage bucket and upload the export files to it. A command-line alternative using the OCI CLI is sketched after these steps.
1. Sign in to your Oracle Cloud Account at cloud.oracle.com.
2. From the Oracle Cloud Infrastructure left navigation list, choose Object Storage, and then choose Object Storage from the sublist.
3. Pick a compartment to create the storage bucket in.
4. Click Create Bucket.
5. In the Create Bucket dialog, give the bucket a name and then click Create Bucket.
6. After the bucket is created, click its name in the list of buckets to display its Bucket Details page.
7. In the Objects box, click Upload Objects.
8. In the Upload Objects dialog, click the select files link in the Choose Files From Your Computer box.
9. In the file browser, navigate to and select your export files. Then, click Open.
10. In the Upload Objects dialog, click Upload Objects to start uploading the files you selected.
11. After the uploads complete, close the Upload Objects dialog.
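If you have the Oracle Cloud Infrastructure command line interface (OCI CLI) installed and configured, uploading from a terminal is an alternative to the console steps above. The following is a sketch only; the bucket name and dump file names are placeholders:
# Upload each Data Pump dump file to the target bucket (repeat for every file).
oci os object put --bucket-name my-bucket --file export100.dmp
oci os object put --bucket-name my-bucket --file export101.dmp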
Parent topic: Use Oracle Data Pump
Import Data Using Oracle Data Pump
Oracle recommends using the latest Oracle Data Pump version for importing data from Data Pump files into your Autonomous Database, as it contains enhancements and fixes for a better experience.
Download the latest version of the Oracle Instant Client Basic Package and Tools Package (which includes Oracle Data Pump) for your platform from Oracle Instant Client Downloads. See the installation instructions on the platform install download page for the installation steps required after you download Oracle Instant Client.
In Oracle Data Pump version 18.3 and later, the credential argument authenticates Data Pump to the Cloud Object Storage service you are using for your source files. The dumpfile argument is a comma-delimited list of URLs for your Data Pump files.
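The credential you reference must already exist in the target Autonomous Database. The following is a minimal sketch using DBMS_CLOUD.CREATE_CREDENTIAL; the credential name, user name, and auth token are placeholders, not values prescribed by this document:
BEGIN
  -- Create an object storage credential; supply your own cloud user name and auth token.
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'DEF_CRED_NAME',
    username        => 'oci_user@example.com',
    password        => 'auth_token_value');
END;
/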
Data Pump Import versions 12.2.0.1 and earlier do not have the credential parameter. If you are using an older version of Data Pump Import, you need to define a default credential property for Autonomous Database and use the default_credential keyword in the dumpfile parameter.
In Oracle Data Pump, if your source files reside in Oracle Cloud Infrastructure Object Storage, you can use Oracle Cloud Infrastructure native URIs or Swift URIs. See Cloud Object Storage URI Formats for details on these URI formats.
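For illustration only, a minimal import of the SH schema exported earlier might look like the following with Oracle Data Pump 18.3 or later. The connect string, credential name, and object storage URL are placeholders, and additional parameters may be appropriate depending on your source database:
impdp admin/password@connect_string \
     directory=data_pump_dir \
     credential=DEF_CRED_NAME \
     dumpfile=https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/export%l.dmp \
     parallel=16 \
     encryption_pwd_prompt=yes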
Parent topic: Use Oracle Data Pump
Access Log Files for Data Pump Import
The log files for Data Pump Import operations are stored in the directory you specify with the Data Pump impdp directory parameter.
To access the log file, you need to move it to your Cloud Object Storage using the procedure DBMS_CLOUD.PUT_OBJECT. For example, the following PL/SQL block moves the file import.log to your Cloud Object Storage:
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'DEF_CRED_NAME',
    object_uri      => 'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/
In this example, namespace-string is the Oracle Cloud Infrastructure object storage namespace and bucketname is the bucket name. See Understanding Object Storage Namespaces for more information.
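One way to confirm that the log file arrived in the bucket is to list the bucket contents with DBMS_CLOUD.LIST_OBJECTS, using the same placeholder credential name and URL as above:
-- List the objects in the bucket to verify that import.log was uploaded.
SELECT object_name, bytes
  FROM DBMS_CLOUD.LIST_OBJECTS(
         'DEF_CRED_NAME',
         'https://objectstorage.us-ashburn-1.oraclecloud.com/n/namespace-string/b/bucketname/o/');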
For more information, see DBMS_CLOUD for Objects and Files.
Parent topic: Use Oracle Data Pump