From the Data Pump node, select the menu item for the Data Pump Export Wizard.
As we said earlier, we are only doing a schema-mode export today, and in the image below we are selecting just one schema for export: the 'BARRY' schema.
Step 3 lets us add filters on what gets exported. We won't add any today, and will let the export take everything in the schema.
In Step 4, the Data tab, we can add data filters to all tables, or to individual tables as required. Again, for this demonstration I'm not going to add any, and will let Data Pump export everything. (We'll see how a data filter is expressed in the generated PL/SQL a little later.)
Step 5 allows us to specify the options for the export. Our primary interest here is the directory used for the log file.
Step 6 focuses on our output directory; specify the one you want to use here. Remember, as in Step 5, this directory must exist and your user must have the privileges to write to it. Check the main Connections navigator for directories. By default there is a DATA_PUMP_DIR set up, but make sure the directory it points to actually exists. We've created a directory called 'BARRYS_DPUMP_DIR' for this example.
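If you need to create a directory yourself, a minimal sketch as a privileged user looks like this; the OS path here matches the one in the log below, but point it at a folder that actually exists on your database server:

    -- Create the directory object and let BARRY write to it.
    -- Run as a privileged user; the folder must already exist on the server.
    CREATE OR REPLACE DIRECTORY barrys_dpump_dir AS 'D:\DEMO\DPUMP';
    GRANT READ, WRITE ON DIRECTORY barrys_dpump_dir TO barry;

    -- Verify which directory objects your user can see, and their paths.
    SELECT directory_name, directory_path FROM all_directories;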
Step 7 specifies the schedule on which the job will dump the data for you. We're choosing 'Immediately' as our option, but you can schedule this job to run whenever you like, and repeatedly if required.
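Under the covers, a non-immediate schedule is typically run as a database Scheduler job. As a rough illustration (not the wizard's exact output; the job name, timing, and wrapped procedure are all assumptions), a repeating export job might be set up like this:

    BEGIN
      DBMS_SCHEDULER.create_job(
        job_name        => 'NIGHTLY_EXPORT_JOB',          -- illustrative name
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN my_export_proc; END;',  -- assumed procedure wrapping the export
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',         -- every day at 02:00
        enabled         => TRUE);
    END;
    /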
Lastly, we have the summary screen, which is split into two parts. The main summary panel shows what actions you are taking and how they will be carried out; the second panel shows the actual PL/SQL which will be run on your behalf to create the data pump job and dump the data.
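That generated code drives the DBMS_DATAPUMP package. To give you a feel for what it does, here is a minimal hand-written sketch of a schema-mode export; the job name, file names, and filter values are my own illustrations, not the wizard's exact output:

    DECLARE
      h     NUMBER;
      state VARCHAR2(30);
    BEGIN
      -- Open a schema-mode export job (job name is illustrative).
      h := DBMS_DATAPUMP.open(operation => 'EXPORT',
                              job_mode  => 'SCHEMA',
                              job_name  => 'EXPORT_JOB_DEMO');

      -- Dump file and log file, written via our directory object.
      DBMS_DATAPUMP.add_file(handle    => h,
                             filename  => 'EXPDAT01.DMP',
                             directory => 'BARRYS_DPUMP_DIR',
                             filetype  => DBMS_DATAPUMP.ku$_file_type_dump_file);
      DBMS_DATAPUMP.add_file(handle    => h,
                             filename  => 'EXPDAT.LOG',
                             directory => 'BARRYS_DPUMP_DIR',
                             filetype  => DBMS_DATAPUMP.ku$_file_type_log_file);

      -- Restrict the export to the BARRY schema (our Step 2 choice).
      DBMS_DATAPUMP.metadata_filter(handle => h,
                                    name   => 'SCHEMA_EXPR',
                                    value  => q'[IN ('BARRY')]');

      -- A Step 4 data filter would go here, e.g. (table and predicate assumed):
      -- DBMS_DATAPUMP.data_filter(handle      => h,
      --                           name        => 'SUBQUERY',
      --                           value       => 'WHERE created > SYSDATE - 30',
      --                           table_name  => 'MD_DERIVATIVES',
      --                           schema_name => 'BARRY');

      -- Kick the job off and wait for it to finish.
      DBMS_DATAPUMP.start_job(h);
      DBMS_DATAPUMP.wait_for_job(h, state);
    END;
    /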
Once the job has run, you can go to your database directory and see your exported file, together with the log file.
If you're successful, your log file will contain something like this (I've cut a lot out of it, as it is long):
Starting "BARRY"."EXPORT_JOB_SQLDEV_327":
Estimate in progress using BLOCKS method...
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
. estimated "BARRY"."MD_ADDITIONAL_PROPERTIES" 3 MB
. estimated "BARRY"."STAGE_TERADATA_TABLETEXT" 2.062 MB
. estimated "BARRY"."MD_DERIVATIVES" 2 MB
. estimated "BARRY"."MD_FILE_ARTIFACTS" 2 MB
...
. estimated "BARRY"."文化大革命" 0 KB
Total estimation using BLOCKS method: 17.75 MB
Processing object type SCHEMA_EXPORT/USER
Processing object type SCHEMA_EXPORT/SYSTEM_GRANT
.....
Processing object type SCHEMA_EXPORT/TABLE/INDEX/STATISTICS/FUNCTIONAL_AND_BITMAP/INDEX_STATISTICS
Processing object type SCHEMA_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
. . exported "BARRY"."MD_ADDITIONAL_PROPERTIES" 13.54 KB 59 rows
. . exported "BARRY"."STAGE_TERADATA_TABLETEXT" 1.197 MB 25 rows
. . exported "BARRY"."MD_DERIVATIVES" 49.60 KB 386 rows
...
. . exported "BARRY"."文化大革命" 0 KB 0 rows
Master table "BARRY"."EXPORT_JOB_SQLDEV_327" successfully loaded/unloaded
******************************************************************************
Dump file set for BARRY.EXPORT_JOB_SQLDEV_327 is:
D:\DEMO\DPUMP\EXPDAT01.DMP
Job "BARRY"."EXPORT_JOB_SQLDEV_327" successfully completed at 15:42:52
So, for today, that's exporting from the Oracle database using the Data Pump support built into Oracle SQL Developer 3.1, which will be available soon! We'll have part 2, on importing this dump file, next.