Using Oracle 10g's Data Pump
Learning objective
After completing this topic, you should be able to recognize how to use Oracle's Data Pump to export and import data.
1. Exporting data with Data Pump
Oracle's Data Pump utility allows you to move large amounts of data from one database to another.
You can use it to export
- databases: Exporting the entire database is known as a full export. You can only perform a full export if you have the EXP_FULL_DATABASE role. The SYS, ORDSYS, ORDPLUGINS, CTXSYS, MDSYS, LBACSYS, and XDB schemas are not exported, because the metadata they contain is exported in other objects.
- schemas: Users with the EXP_FULL_DATABASE role can export all schemas, but other users can only export their own schema. This export does not include cross-schema references, unless you choose to export both of the schemas in the reference.
- tables: Tables, partitions, and dependent objects are exported, but type definitions for columns are not, because they should already be part of the instance that will import the table. You can only export tables from other schemas if you have the EXP_FULL_DATABASE role.
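Outside Enterprise Manager, the same three export modes can be driven from PL/SQL through the DBMS_DATAPUMP package. A minimal sketch, assuming the HR schema and the DATA_FILE_DIR directory object used later in this walkthrough; the job name is illustrative:

```sql
-- Sketch only: an export job via DBMS_DATAPUMP (names are illustrative).
-- job_mode may be 'FULL', 'SCHEMA', or 'TABLE', matching the three export types.
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.OPEN(
         operation => 'EXPORT',
         job_mode  => 'SCHEMA',
         job_name  => 'EXP_HR_SCHEMA');
  -- Restrict the job to the HR schema
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''HR'')');
  -- Write the dump file through a directory object
  DBMS_DATAPUMP.ADD_FILE(h, 'expdat%U.dmp', 'DATA_FILE_DIR');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/
```

A full export would use job_mode => 'FULL' with no schema filter; a table export would use 'TABLE' plus a NAME_EXPR filter naming the tables.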
Data Pump works by writing the data to files, which can then be imported by another database.
Suppose you want to export the EMPLOYEES and JOBS tables. You start by clicking the Maintenance tab in Enterprise Manager.
Now you want to begin the export process.
You select Export to Files to begin the export process.
This opens the Export: Export Type page. Here you must choose whether you are going to export the entire database, database schemas, or database tables. You must also confirm your access credentials.
You want to export a database table, so you select the Tables option, and click Continue.
Enterprise Manager lists the tables that have been selected for export. So far no tables have been selected.
You now need to add tables for export.
You click the Add button to add a table.
This opens the Export: Add Tables page. You can use this page to search for the table you want to export, by specifying the table name and the schema that contains it.
The tables you want to export are in the HR schema, so you enter HR in the Schema field, leave the Table field blank, and click Go.
All of the tables in the HR schema are listed.
You select the EMPLOYEES and JOBS tables, and click the Select button to continue.
The EMPLOYEES and JOBS tables are now listed on the Export page.
You click Next to continue to the next stage of the export process.
The Export: Options page allows you to define how the export will proceed.
- Maximum Number of Threads in Export Job: The number of threads devoted to the export job determines how fast it will be carried out. As you increase the number of threads used for this task, you reduce the resources available for other work. So, before deciding how many threads to assign, you should consider the size of the export job and the likely server workload when it will be carried out.
- Estimate disk space: Enterprise Manager can estimate the size of the objects to be exported by counting the number of blocks they contain or by examining the most recent set of table statistics, the latter giving a more accurate answer. This estimate tells you approximately how many bytes will be taken up by the table row data.
- Optional File: By default, the Data Pump utility automatically creates a log file to record the progress of the export. You can specify the directory path and filename to use, or choose not to generate the file at all.
The Export: Options page also contains an Advanced options section.
Here, you can decide whether to export normal data, metadata, or all data from the selected objects. You can also exclude certain objects from the set to be exported.
You can use the Flashback options to export the data as it stood at a particular time, or after a particular change, identified by a System Change Number (SCN), or you can apply a query to the selected objects, and only export the results.
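The Flashback and query options map onto DBMS_DATAPUMP calls as well. A hedged sketch, in which the SCN value and the WHERE predicate are purely illustrative:

```sql
-- Sketch only: advanced export options (SCN and predicate are illustrative).
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.OPEN('EXPORT', 'TABLE', NULL, 'EXP_AS_OF_SCN');
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''HR'')');
  DBMS_DATAPUMP.METADATA_FILTER(h, 'NAME_EXPR', 'IN (''EMPLOYEES'')');
  DBMS_DATAPUMP.ADD_FILE(h, 'expdat%U.dmp', 'DATA_FILE_DIR');
  -- Export the data as it stood at a particular System Change Number
  DBMS_DATAPUMP.SET_PARAMETER(h, 'FLASHBACK_SCN', 1593456);
  -- Or apply a query and export only the matching rows
  DBMS_DATAPUMP.DATA_FILTER(h, 'SUBQUERY', 'WHERE salary > 10000', 'EMPLOYEES');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/
```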
You accept the default options, scroll to the bottom of the page, and then click Next to continue.
Next is the Export: Files page. The Directory Object drop-down list allows you to specify the directory the data will be exported to, and the File Name field specifies the name of the file that will contain the data. In this case, you stick with the default values for each option: the DATA_FILE_DIR directory object and the EXPDAT%U.dmp file.
You can specify multiple files and file locations on this page. Oracle will also create multiple files automatically, so that each of the threads specified on the Options page is used to create a file.
The substitution variable %U at the end of a filename, such as the default EXPDAT%U.dmp, means that the first file created will be EXPDAT01.dmp, the second EXPDAT02.dmp, and so on.
You click Next to continue.
The Export: Schedule page has two sections - Job Parameters and Job Schedule.
In the Job Parameters section, you specify the job parameters by typing the export job name and description into the text fields provided. If the Job Name field is left blank, the system will assign a name automatically.
The Job Schedule section allows you to specify when the export job will be carried out. You can either choose to perform the export immediately, or use the text fields and drop-down list boxes to set a date and time for the export.
Finally, the review page shows you the settings you have selected. It also displays the PL/SQL code that will be used to carry out the export.
Question
Which of these export options can you set for the Data Pump utility?
Options:
- Maximum number of threads to be used in the export
- Export the database objects as they stand at a particular time
- The time for the export to start
- The time for the export to finish
Answer
The Data Pump utility allows you to set the maximum number of threads to be used in the export, the particular database state that is to be exported, and the time when the export is to be carried out.
Option 1 is correct. You can decide the maximum number of threads that will be used in the export process. This determines how long the export will take, and what portion of system resources will be dedicated to the export.
Option 2 is correct. Instead of exporting the current state of the database, you can export it as it stands at a particular time, or at a specified System Change Number (SCN).
Option 3 is correct. You can start the export immediately, or reschedule it.
Option 4 is incorrect. You can set the time for the export job to start, but the finish time will depend on the size of the job and can't be configured in advance.
Question
You want to export the data from the LOCATIONS table to another database, using Enterprise Manager.
Select the option that starts the process of exporting a table.
Options:
- Click Administration, then the Export to Files link. Select the Tables option, and then click Continue.
- Click Maintenance, then the Export to Files link. Select the Tables option, and then click Continue.
Answer
You click Maintenance, then the Export to Files link. You select the Tables option, and then click Continue.
2. Loading data with Data Pump
You can also use Data Pump to import data into your database. You start by selecting the Import from Files link on the Maintenance tabbed page in Enterprise Manager.
Suppose that you are engaged in a different data migration process and you need to import files stored earlier in the directory specified by the DM_PMML_DIR directory object.
On the Import: Files page, you first specify the name and location of the source files you are going to import. You wish to import files also named using the EXPDAT%U specifier, but have to change the default directory object.
Next you need to decide whether you are going to import the complete file, or specific schemas or tables from the source files.
You click Continue to move forward to the Import: Re-Mapping page.
The information that you import describes a schema, tablespace, or data file.
You can retain these data structures when you import the data, so that an EMPLOYEES schema in the source file becomes a new EMPLOYEES schema in your database, for example.
Or you can remap the imported data, so that the data you import is added to an existing data structure in your database.
You decide to retain the original data structures and click Next.
The Import: Options page allows you to decide how many threads will be used by the data import process. You can also decide whether to create a log file to track the import process.
The advanced import options allow you to decide whether to import both metadata and data, which objects to import, and what should happen to the import process if it encounters another version of the object in your database.
You accept the default settings, and click Next to continue.
The Import: Schedule page allows you to create a name and description for the import job, and decide when it will start.
You do not want to make any changes, so you click Next to continue.
The import choices you have made are summarized on the Import: Review page, which also contains the PL/SQL code that will execute the import.
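The generated import code resembles a DBMS_DATAPUMP job along the lines of the sketch below; the job name is illustrative, and the commented remap line shows where a schema remapping would go if you had chosen one:

```sql
-- Sketch only: a full import from the DM_PMML_DIR directory object.
DECLARE
  h NUMBER;
BEGIN
  h := DBMS_DATAPUMP.OPEN('IMPORT', 'FULL', NULL, 'IMP_EXPDAT');
  DBMS_DATAPUMP.ADD_FILE(h, 'EXPDAT%U.dmp', 'DM_PMML_DIR');
  -- Skip objects that already exist in the target database
  DBMS_DATAPUMP.SET_PARAMETER(h, 'TABLE_EXISTS_ACTION', 'SKIP');
  -- A remapping, if chosen, would look like this (target name hypothetical):
  -- DBMS_DATAPUMP.METADATA_REMAP(h, 'REMAP_SCHEMA', 'HR', 'HR_COPY');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.DETACH(h);
END;
/
```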
Question
Which options are available to you when you are importing files using Data Pump?
Options:
- Estimating the size of the import job
- Import all or only part of the data
- Scheduling the import to take place later
Answer
When you are importing files using Data Pump, you can choose to import all or only part of the data, and schedule the import job to take place later.
Option 1 is incorrect. There is no facility for estimating the size of an import job. However, since the files are already on the server, their size can be checked independently, and you can increase the number of threads available to the import job.
Option 2 is correct. You can choose to import only the metadata, for example, or to import the table data without the metadata.
Option 3 is correct. You can perform the import immediately or schedule it for a later time.
Question
You have decided to import the data from the EXPORT.DMP file into your database.
Select the option that imports this file.
Options:
- Click Import from Files, type EXPORT.DMP in the File Name field, and click Continue. You then click the Next button three times, before clicking the Submit Job button.
- Click Data Pump, and then Data Pump Import. Type EXPORT.DMP in the File Name field, and click Continue. You then click the Next button three times, before clicking the Submit Job button.
Answer
You click Import from Files, type EXPORT.DMP in the File Name field, and click Continue. You then click the Next button three times, before clicking the Submit Job button.
3. Creating directory objects
Data Pump is based on the server, rather than the client, and the file locations you specify use server-based directory paths. These paths must be created as directory objects, mapping a name to a directory path on the file system.
You can create new directory objects in Enterprise Manager at any point where you need to specify a directory object. This could be when you are exporting data to files or importing from files.
Suppose you are exporting a database table, and you want to create a new directory object. On the Export: Files page, you click the Create Directory Object button.
You then add the name and directory path to the object in the text fields on the Export: Create Directory Object page, and click OK.
You can also create a directory object directly in SQL using this syntax.
CREATE DIRECTORY dpump
AS '/usr/apps/datafile';
You automatically have read/write privileges to the directories you create, and can grant these privileges to other users.
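Granting those privileges to another user takes the form shown below; the grantee name is illustrative:

```sql
-- Give another user access to the dpump directory object (user name illustrative)
GRANT READ, WRITE ON DIRECTORY dpump TO hr;
```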
Question
Identify the characteristics of directory objects.
Options:
- Directory objects can use directory paths from the server or client
- You automatically have read/write permissions for any directory objects you create
- You can create directory objects in SQL or using Enterprise Manager
Answer
You can create directory objects in SQL or using Enterprise Manager, and automatically have read/write permissions to any directory objects you create.
Option 1 is incorrect. Data Pump is based on the server, so the directory paths used in the directory must also be server-based.
Option 2 is correct. You have read/write permissions to the directory objects you create, and you can also give these rights to other users.
Option 3 is correct. You can create directory objects in Enterprise Manager, at any point where you have to specify a directory object to be used in a process. You can also create them in SQL.
Question
Which of these SQL statements creates a directory object?
Options:
- CREATE DIRECTORY dpump_dir1 AS 'usr/apps/datafiles';
- CREATE DIRECTORY WHERE DIRECTORY_NAME = 'dpump_dir1' AND LOCATION = 'usr/apps/datafiles';
- NEW DIRECTORY dpump_dir1 AS 'usr/apps/datafiles';
Answer
You use the command CREATE DIRECTORY dpump_dir1 AS 'usr/apps/datafiles'; to create a directory object.
Option 1 is correct. This command will create a directory object called dpump_dir1 using the usr/apps/datafiles directory path.
Option 2 is incorrect. This is not a correctly formed SQL command.
Option 3 is incorrect. This is not a correctly formed SQL command.
Question
You are starting to import some files into your database, and want to create a new directory object with the name NEW_DIR and directory path /datafiles.
Select the option that creates the object.
Options:
- Select Import from Files, and click Add Directory Object. Type NEW_DIR in the Name field, and /datafiles in the Operating System Directory field, and click OK.
- Select Import from Files, and click Create Directory Object. You then type NEW_DIR in the Name field, and /datafiles in the Operating System Directory field, and click OK.
Answer
You select Import from Files, and click Create Directory Object. You then type NEW_DIR in the Name field, and /datafiles in the Operating System Directory field, and click OK.
Summary
Oracle's Data Pump allows you to move large amounts of data from one database to another. You can move an entire database, one or more schemas, or selected tables. You can dedicate multiple threads to the export job, create a log file to track its progress, and schedule it to take place immediately or at some point in the future.
A data export creates a data file. The data in this file can then be imported by another database. You can import an entire file, or extract particular tables or schemas. As these objects are extracted, they can be recreated as they were in their source database, or the data they contain can be mapped into existing database objects.
Data Pump is based on the server. As you import or export data, instead of specifying a path to the file concerned, you reference a directory object, which contains a server path. These directory objects can be created in Enterprise Manager or in SQL.