PARFILE Specify parameter file name.
DIRECTORY Directory object to be used for dump and log files. The directory object must already exist: ex. CREATE DIRECTORY mydir AS '/home/myname/mydir';
DUMPFILE Specify list of destination dump file names [expdat.dmp].
FULL Export entire database [NO].
SCHEMAS List of schemas to export [login schema].
TABLES Identifies a list of tables to export. For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995
TABLESPACES Identifies a list of tablespaces to export.
LOGFILE Specify log file name [export.log].
CONTENT Specifies data to unload. Valid keyword values are: [ALL], DATA_ONLY and METADATA_ONLY.
INCLUDE Include specific object types. For example, INCLUDE=TABLE_DATA. ex. INCLUDE=TABLE:"IN ('EMPLOYEES', 'DEPARTMENTS')" ex. INCLUDE=TABLE,INDEX:"LIKE 'EMP%'" For materialized views two INCLUDEs are needed: INCLUDE=TABLE and INCLUDE=VIEW (see the parameter file examples below).
QUERY Predicate clause used to export a subset of a table. ex. QUERY=employees:"WHERE department_id > 10"
EXCLUDE Exclude specific object types. ex. EXCLUDE=SCHEMA:"='HR'" ex. FULL=Y EXCLUDE=SCHEMA:"='HR'"
FILESIZE Specify the size of each dump file in units of bytes.
FLASHBACK_TIME Exports data consistent as of the specified time (the Data Pump equivalent of the original exp CONSISTENT=Y). ex. FLASHBACK_TIME=SYSTIMESTAMP ex. FLASHBACK_TIME="TO_TIMESTAMP('15-04-2016 12:31:00','DD-MM-YYYY HH24:MI:SS')"
FLASHBACK_SCN SCN used to reset session snapshot. To get an SCN: SELECT TIMESTAMP_TO_SCN(SYSTIMESTAMP) FROM dual; (see the additional examples at the end of this note)
NOLOGFILE Do not write log file [NO].
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
VIEWS_AS_TABLES Identifies one or more views to be exported as tables. For example, VIEWS_AS_TABLES=HR.EMP_DETAILS_VIEW.
ESTIMATE Calculate job estimates. Valid keyword values are: [BLOCKS] and STATISTICS. The estimate may not be accurate if COMPRESSION (use ESTIMATE=STATISTICS instead), QUERY, or REMAP_DATA is used.
ESTIMATE_ONLY Calculate job estimates without performing the export [NO]. Cannot be used with the QUERY parameter.
SAMPLE Percentage of data to be exported.
PARALLEL Change the number of active workers for current job.
ENCRYPTION Encrypt part or all of a dump file. Valid keyword values are: ALL, DATA_ONLY, ENCRYPTED_COLUMNS_ONLY, METADATA_ONLY and NONE. (see the additional examples at the end of this note)
ENCRYPTION_ALGORITHM Specify how encryption should be done. Valid keyword values are: [AES128], AES192 and AES256.
ENCRYPTION_MODE Method of generating encryption key. Valid keyword values are: DUAL, PASSWORD and [TRANSPARENT].
ENCRYPTION_PASSWORD Password key for creating encrypted data within a dump file.
ENCRYPTION_PWD_PROMPT Specifies whether to prompt for the encryption password [NO]. Terminal echo is suppressed while standard input is read.
NETWORK_LINK Name of remote database link to the source system.
ATTACH Attach to an existing job. For example, ATTACH=job_name.
CLUSTER Utilize cluster resources and distribute workers across the Oracle RAC instances [YES].
REMAP_DATA Specify a data conversion function. For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO
REUSE_DUMPFILES Overwrite destination dump file if it exists [NO].
SERVICE_NAME Name of an active service and associated resource group used to constrain Oracle RAC resources.
VERSION Version of objects to export. Valid keyword values are: [COMPATIBLE], LATEST or any valid database version.
METRICS Report additional job information to the export log file [NO].
LOGTIME Specifies that messages displayed during export operations be timestamped. Valid keyword values are: ALL, [NONE], LOGFILE and STATUS.
KEEP_MASTER Retain the master table after an export job that completes successfully [NO].
ABORT_STEP Stop the job after it is initialized or at the indicated object. Valid values are -1 or N, where N is zero or greater; N corresponds to the object's process order number in the master table.
ACCESS_METHOD Instructs Export to use a particular method to unload data. Valid keyword values are: [AUTOMATIC], DIRECT_PATH and EXTERNAL_TABLE.
DATA_OPTIONS Data layer option flags. Valid keyword values are: XML_CLOBS.
HELP Display Help messages [NO].
------------------------------------------------------------------------------------------------
command line examples (SCHEMAS and TABLES set the export mode and cannot be combined in one command)
expdp system/passwd DIRECTORY=my_directory DUMPFILE=mydatadump.dmp SCHEMAS=hr LOGFILE=mydatadump_exp.log
expdp system/passwd DIRECTORY=my_directory DUMPFILE=mydatadump.dmp TABLES=employees,departments LOGFILE=mydatadump_exp.log
parameter file examples
SCHEMAS=HR
DUMPFILE=mydatadump.dmp
DIRECTORY=my_directory
LOGFILE=mydatadump_exp.log
INCLUDE=TABLE:"IN ('EMPLOYEES', 'DEPARTMENTS')"
or TABLES=(EMPLOYEES, DEPARTMENTS)
CONTENT=DATA_ONLY
EXCLUDE=TABLE:"IN ('COUNTRIES', 'REGIONS')"
QUERY=employees:"WHERE department_id != 50 ORDER BY employee_id"
FLASHBACK_TIME="TO_TIMESTAMP('27-10-2015 13:16:00', 'DD-MM-YYYY HH24:MI:SS')"
FLASHBACK_TIME=SYSTIMESTAMP
FLASHBACK_TIME="TO_TIMESTAMP(TO_CHAR(TRUNC(SYSDATE)+20/24,'YYYY-MM-DD HH24:MI:SS'),'YYYY-MM-DD HH24:MI:SS')"
INCLUDE=PROCEDURE
INCLUDE=INDEX:"LIKE 'EMP%'"
-- materialized views need to be included as both TABLE and VIEW
INCLUDE=TABLE:"IN ('matview_east_sales','matview_west_sales')"
INCLUDE=VIEW:"IN ('matview_east_sales','matview_west_sales')"
-------------------------------------------------------------------------------------------------
INTERACTIVE MODE
Note: abbreviations are allowed.
ADD_FILE Add dumpfile to dumpfile set.
CONTINUE_CLIENT Return to logging mode. Job will be restarted if idle.
EXIT_CLIENT Quit client session and leave job running.
FILESIZE Default filesize (bytes) for subsequent ADD_FILE commands.
HELP Summarize interactive commands.
KILL_JOB Detach and delete job.
PARALLEL Change the number of active workers for current job.
REUSE_DUMPFILES Overwrite destination dump file if it exists [NO].
START_JOB Start or resume current job. Valid keyword values are: SKIP_CURRENT.
STATUS Frequency (secs) at which job status is monitored; the default [0] shows new status when available.
STOP_JOB Orderly shutdown of job execution; exits the client. Valid keyword values are: IMMEDIATE. ex. Export> STOP_JOB=IMMEDIATE
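interactive session example (a minimal sketch; the job name SYS_EXPORT_SCHEMA_01 is the default Oracle-assigned name for a schema-mode export and is assumed here, any job name listed in DBA_DATAPUMP_JOBS can be used)
expdp system/passwd ATTACH=SYS_EXPORT_SCHEMA_01
Export> STATUS=30             (report job status every 30 seconds)
Export> STOP_JOB=IMMEDIATE    (stop the job now; it can be resumed later)
expdp system/passwd ATTACH=SYS_EXPORT_SCHEMA_01
Export> START_JOB             (resume the stopped job)
Export> CONTINUE_CLIENT       (return to logging mode)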
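------------------------------------------------------------------------------------------------
additional command line examples
FLASHBACK_SCN (a sketch of the consistent-export workflow referenced above; the SCN value and the hr_consistent file names are illustrative)
SQL> SELECT TIMESTAMP_TO_SCN(SYSTIMESTAMP) FROM dual;   -- suppose it returns 9734581
expdp system/passwd DIRECTORY=my_directory DUMPFILE=hr_consistent.dmp LOGFILE=hr_consistent_exp.log SCHEMAS=hr FLASHBACK_SCN=9734581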
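ENCRYPTION (a sketch using password-based dump file encryption; the password and hr_enc file names are illustrative, and dump file encryption may require the Advanced Security Option depending on edition and encryption mode)
expdp system/passwd DIRECTORY=my_directory DUMPFILE=hr_enc.dmp LOGFILE=hr_enc_exp.log SCHEMAS=hr ENCRYPTION=ALL ENCRYPTION_MODE=PASSWORD ENCRYPTION_PASSWORD=MySecretPwd ENCRYPTION_ALGORITHM=AES256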