Oracle Data Pump: expdp, impdp

Time: 2020-9-16

Using Data Pump technology to perform logical backups

Overview of data pump

Data Pump is a high-speed data transfer technology for moving data between databases, and between a database and the operating system (introduced in Oracle 10g).

Overview of logical backup

A logical backup exports database objects (such as users, tables, and stored procedures) using tools like expdp or exp.

Comparison with physical backup

A physical backup copies Oracle's physical files (data files, control files, archived log files, and so on).
The purpose of a physical backup is to restore the entire database exactly, which requires the recovery environment to match the backup environment.
A logical backup targets the data itself and is highly portable; however, backup and recovery are slower, so it is generally not used for large business systems.

Operation steps

1. Create a directory object (the location where export and import files are read and written)

CREATE DIRECTORY directory_name (e.g. DUMP_DIR) AS 'absolute_path';

2. Grant privileges on the directory (e.g. DUMP_DIR)

GRANT READ, WRITE ON DIRECTORY directory_name TO username;   (grant to the user who will perform the export or import)

SELECT * FROM DBA_DIRECTORIES;        -- query all created directory objects
DROP DIRECTORY directory_name;        -- delete a directory object
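For example, a minimal sketch run as a privileged user (the directory name, path, and grantee are assumptions):

CREATE DIRECTORY dump_dir AS '/u01/app/oracle/dpump';         -- assumed path; must exist on the database server
GRANT READ, WRITE ON DIRECTORY dump_dir TO scott;             -- assumed export/import user
SELECT directory_name, directory_path FROM DBA_DIRECTORIES;   -- verify the directory object
-- DROP DIRECTORY dump_dir;                                   -- remove it when no longer needed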

3. expdp export (expdp and impdp are Oracle command-line utilities; run them from the operating system shell, not from inside SQL*Plus)

First: export the entire database:
expdp username/password@orcl dumpfile=export_file.dmp directory=directory_name full=y logfile=log_name.log (the log is written to the same directory as the .dmp file)

Second: export by schema (user):
expdp username/password@orcl schemas=schema_name dumpfile=export_file.dmp directory=directory_name logfile=log_name.log
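For instance, a hedged example that exports the SCOTT schema (the connect string, credentials, and file names are assumptions):

expdp scott/tiger@orcl schemas=scott directory=dump_dir dumpfile=scott_20200916.dmp logfile=scott_20200916.log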

Third: export by tablespace:
expdp username/password@orcl tablespaces=tablespace_name dumpfile=export_file.dmp directory=directory_name logfile=log_name.log

Fourth: export by table:
expdp username/password@orcl tables=table_name dumpfile=export_file.dmp directory=directory_name logfile=log_name.log
To export only the table structure:
expdp username/password@orcl directory=directory_name dumpfile=export_file.dmp content=metadata_only tables=(a,b)
content=metadata_only means only the table definitions (no data rows) are exported.

Fifth: export by query condition:
expdp username/password@orcl tables=table_name query='table_name:"WHERE column = value"' dumpfile=export_file.dmp directory=directory_name logfile=log_name.log
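A concrete sketch of a filtered export (the table, column, value, and file names are assumptions; quoting rules differ between shells, and a parameter file avoids the escaping entirely):

expdp scott/tiger@orcl tables=emp query='emp:"WHERE deptno = 10"' directory=dump_dir dumpfile=emp_dept10.dmp logfile=emp_dept10.log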

4. impdp import (run from the operating system shell, not from inside SQL*Plus)

First: import the entire dump into the database:
impdp username/password directory=data_dir dumpfile=expdp.dmp full=y

Second: import by schema, remapping from source user 1 to target user 2:
impdp username/password remap_schema=source_user1:target_user2 directory=data_dir dumpfile=expdp.dmp logfile=impdp.log

Third:
a. Import tables TABLE1 and TABLE2 from user A into user B:
impdp username/password tables=a.table1,a.table2 remap_schema=A:B directory=data_dir dumpfile=expdp.dmp logfile=impdp.log
b. Import tablespaces TBS01, TBS02, and TBS03 into tablespace A_TBS, remap user B's data to user A, and generate new OIDs to avoid conflicts:
impdp username/password remap_tablespace=TBS01:A_TBS,TBS02:A_TBS,TBS03:A_TBS remap_schema=B:A full=y transform=oid:n directory=data_dir dumpfile=expdp.dmp logfile=impdp.log
c. Import only the table structure (object definitions, no rows); see the sketch below.
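A minimal sketch for (c), assuming content=metadata_only (listed among the impdp parameters below) limits the import to object definitions:

impdp username/password directory=data_dir dumpfile=expdp.dmp content=metadata_only logfile=impdp.log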

Fourth: import by tablespace:
impdp username/password tablespaces=tbs1 directory=data_dir dumpfile=expdp.dmp logfile=impdp.log

Fifth: load data into existing objects (append/replace):
impdp username/password directory=data_dir dumpfile=expdp.dmp schemas=system table_exists_action=replace logfile=impdp.log
table_exists_action: the action to perform when the imported object already exists. Valid keywords: skip, append, replace, and truncate.
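For example, to append rows to tables that already exist instead of replacing them (the user, directory, and file names are assumptions):

impdp scott/tiger directory=dump_dir dumpfile=scott_20200916.dmp schemas=scott table_exists_action=append logfile=scott_append.log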

5. Parameters

1. expdp

ATTACH: attach to an existing job, e.g. ATTACH[=job_name].

COMPRESSION: reduce the size of the dump file contents; valid keywords are: ALL, (METADATA_ONLY), DATA_ONLY, and NONE.

CONTENT: specify the data to unload; valid keywords are: (ALL), DATA_ONLY, and METADATA_ONLY.

DATA_OPTIONS: data layer flags; the only valid value is XML_CLOBS, which writes the XML data type in CLOB format.

DIRECTORY: the directory object used for dump files and log files, i.e. the logical directory.

DUMPFILE: the list of destination dump files (expdp.dmp), e.g. DUMPFILE=expdp1.dmp,expdp2.dmp.

ENCRYPTION: encrypt part or all of the dump file; valid keywords are: ALL, DATA_ONLY, METADATA_ONLY, ENCRYPTED_COLUMNS_ONLY, and NONE.

ENCRYPTION_ALGORITHM: specify how encryption is performed; valid keywords are: (AES128), AES192, and AES256.

ENCRYPTION_MODE: the method of generating the encryption key; valid keywords are: DUAL, PASSWORD, and (TRANSPARENT).

ENCRYPTION_PASSWORD: the password key used for creating encrypted column data.

ESTIMATE: calculate job estimates; valid keywords are: (BLOCKS) and STATISTICS.

ESTIMATE_ONLY: calculate job estimates without performing the export.

EXCLUDE: exclude specific object types, e.g. EXCLUDE=TABLE:EMP. Syntax: EXCLUDE=[object_type]:[name_clause],[object_type]:[name_clause].

FILESIZE: specify the size of each dump file in bytes.

FLASHBACK_SCN: the SCN used to set the session snapshot back to a previous state; exports table data as of a specific SCN.

FLASHBACK_TIME: the time used to find the SCN closest to the specified time; exports table data as of a specific point in time. Note that FLASHBACK_SCN and FLASHBACK_TIME cannot be used together.

FULL: export the entire database.

HELP: display help messages.

INCLUDE: include specific object types, e.g. INCLUDE=TABLE_DATA.

JOB_NAME: the name of the export job to create.

LOGFILE: the log file name (export.log).

NETWORK_LINK: the name of the database link to the source system.

NOLOGFILE: do not write a log file.

PARALLEL: change the number of active workers for the current job.

PARFILE: specify a parameter file.

QUERY: the predicate clause used to export a subset of a table. Syntax: QUERY=[schema.][table_name:]query_clause.

REMAP_DATA: specify a data conversion function, e.g. REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.

REUSE_DUMPFILES: overwrite the destination dump file if it exists.

SAMPLE: the percentage of data to export.

SCHEMAS: the list of schemas to export (defaults to the login schema).

STATUS: the frequency (in seconds) at which job status is monitored; with the default (0), new status is shown when available.

TABLES: identify the list of tables to export, for one schema only. Syntax: [schema_name.]table_name[:partition_name][,...].

TABLESPACES: identify the list of tablespaces to export.

TRANSPORTABLE: specify whether the transportable method can be used; valid keywords are: ALWAYS and (NEVER).

TRANSPORT_FULL_CHECK: verify the storage segments of all tables.

TRANSPORT_TABLESPACES: the list of tablespaces from which metadata will be unloaded.

VERSION: the version of objects to export; valid keywords are: (COMPATIBLE), LATEST, or any valid database version.
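Several of these parameters are easier to supply through a parameter file, which also avoids shell quoting problems. A minimal sketch (the file name, schema, and table name are assumptions), saved as exp_scott.par and run with: expdp scott/tiger@orcl parfile=exp_scott.par

directory=dump_dir
dumpfile=scott_%U.dmp
logfile=scott.log
schemas=scott
exclude=TABLE:"IN ('TEMP_LOG')"
parallel=2
filesize=2G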

The following commands are available in expdp interactive mode (entered with Ctrl+C during a job, or by attaching to a job with ATTACH):

ADD_FILE: add a dump file to the dump file set.

CONTINUE_CLIENT: return to logging mode; the job is restarted if it was idle.

EXIT_CLIENT: quit the client session and leave the job running.

FILESIZE: the default file size (in bytes) for subsequent ADD_FILE commands.

HELP: summarize the interactive commands.

KILL_JOB: detach from and delete the job.

PARALLEL: change the number of active workers for the current job. PARALLEL=number_of_workers.

REUSE_DUMPFILES: overwrite the destination dump file if it exists.

START_JOB: start or resume the current job.

STATUS: the frequency (in seconds) at which job status is monitored; with the default (0), new status is shown when available. STATUS[=interval].

STOP_JOB: shut down the job in an orderly fashion and exit the client. STOP_JOB=IMMEDIATE shuts down the Data Pump job immediately.
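A short sketch of interactive mode (the job name below is the typical default and is an assumption): press Ctrl+C while an export is running, or reattach later with expdp scott/tiger attach=SYS_EXPORT_SCHEMA_01, then issue commands at the Export> prompt:

Export> status                  (show job progress)
Export> stop_job=immediate      (stop the job and exit the client)
Export> start_job               (after reattaching, resume the stopped job)
Export> continue_client         (return to logging mode)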

2. impdp

ATTACH: attach to an existing job, e.g. ATTACH[=job_name].

CONTENT: specify the data to load; valid keywords are: (ALL), DATA_ONLY, and METADATA_ONLY.

DATA_OPTIONS: data layer flags; the only valid value is SKIP_CONSTRAINT_ERRORS, which treats constraint errors as non-fatal.

DIRECTORY: the directory object used for the dump files, log file, and SQL file, i.e. the logical directory.

DUMPFILE: the list of dump files (expdp.dmp) to import from, e.g. DUMPFILE=expdp1.dmp,expdp2.dmp.

ENCRYPTION_PASSWORD: the password key used to access encrypted column data. This parameter is not valid for network import jobs.

ESTIMATE: calculate job estimates; valid keywords are: (BLOCKS) and STATISTICS.

EXCLUDE: exclude specific object types, e.g. EXCLUDE=TABLE:EMP.

FLASHBACK_SCN: the SCN used to set the session snapshot back to a previous state.

FLASHBACK_TIME: the time used to find the SCN closest to the specified time.

FULL: import everything from the source.

HELP: display help messages.

INCLUDE: include specific object types, e.g. INCLUDE=TABLE_DATA.

JOB_NAME: the name of the import job to create.

LOGFILE: the log file name (import.log).

NETWORK_LINK: the name of the database link to the source system.

NOLOGFILE: do not write a log file.

PARALLEL: change the number of active workers for the current job.

PARFILE: specify a parameter file.

PARTITION_OPTIONS: specify how partitions should be transformed; valid keywords are: DEPARTITION, MERGE, and (NONE).

QUERY: the predicate clause used to import a subset of a table.

REMAP_DATA: specify a data conversion function, e.g. REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.

REMAP_DATAFILE: redefine data file references in all DDL statements.

REMAP_SCHEMA: load objects from one schema into another schema.

REMAP_TABLE: remap a table name to another table name, e.g. REMAP_TABLE=schema.old_table:new_table.

REMAP_TABLESPACE: remap tablespace objects to another tablespace.

REUSE_DATAFILES: initialize a tablespace's data files if they already exist.

SCHEMAS: the list of schemas to import.

SKIP_UNUSABLE_INDEXES: skip indexes that are in the unusable state.

SQLFILE: write all SQL DDL to the specified file.

STATUS: the frequency (in seconds) at which job status is monitored; with the default (0), new status is shown when available.

STREAMS_CONFIGURATION: enable loading of Streams metadata.

TABLE_EXISTS_ACTION: the action to take when the imported object already exists. Valid keywords: (SKIP), APPEND, REPLACE, and TRUNCATE.

TABLES: identify the list of tables to import.

TABLESPACES: identify the list of tablespaces to import.

TRANSFORM: metadata transformations to apply to applicable objects. Valid transform keywords: SEGMENT_ATTRIBUTES, STORAGE, OID, and PCTSPACE.

TRANSPORTABLE: options for choosing transportable data movement; valid keywords are: ALWAYS and (NEVER). Valid only for NETWORK_LINK mode import operations.

TRANSPORT_DATAFILES: the list of data files to import in transportable mode.

TRANSPORT_FULL_CHECK: verify the storage segments of all tables.

TRANSPORT_TABLESPACES: the list of tablespaces from which metadata will be loaded. Valid only for NETWORK_LINK mode import operations.

VERSION: the version of objects to import; valid keywords are: (COMPATIBLE), LATEST, or any valid database version. Valid only with NETWORK_LINK and SQLFILE.
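One practical use of SQLFILE: extract the DDL contained in a dump file without importing anything. A minimal sketch (file names are assumptions):

impdp username/password directory=data_dir dumpfile=expdp.dmp sqlfile=ddl_preview.sql

No objects are created; the DDL is written to ddl_preview.sql in the directory object.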

The following commands are available in impdp interactive mode:

CONTINUE_CLIENT: return to logging mode; the job is restarted if it was idle.

EXIT_CLIENT: quit the client session and leave the job running.

HELP: summarize the interactive commands.

KILL_JOB: detach from and delete the job.

PARALLEL: change the number of active workers for the current job. PARALLEL=number_of_workers.

START_JOB: start or resume the current job. START_JOB=SKIP_CURRENT skips any action that was in progress when the job stopped, before restarting the job.

STATUS: the frequency (in seconds) at which job status is monitored; with the default (0), new status is shown when available. STATUS[=interval].

STOP_JOB: shut down the job in an orderly fashion and exit the client. STOP_JOB=IMMEDIATE shuts down the Data Pump job immediately.