http://www.orafaq.com/wiki/Import_Export_FAQ
Oracle Export and Import Utilities FAQ:
NOTE: Users on Oracle 10g and later releases should use the Data Pump expdp and impdp utilities instead of the older imp and exp utilities described in this document.
Contents
1 What is import/export and why does one need it?
2 How does one use the import/export utilities?
3 Can one export a subset of a table?
4 Can one monitor how fast a table is imported?
5 Can one import tables to a different tablespace?
6 Does one need to drop/truncate objects before importing?
7 Can one import/export between different versions of Oracle?
8 Can one export to multiple files? / Can one beat the Unix 2 Gig limit?
9 How can one improve Import/Export performance?
10 What are the common Import/Export problems?
What is import/export and why does one need it?
Oracle's export (exp) and import (imp) utilities are used to perform logical database backup and recovery. When exporting, database objects are dumped to a binary file which can then be imported into another Oracle database.
These utilities can be used to move data between different machines, databases, or schemas. However, as they use a proprietary binary file format, they can only be used between Oracle databases. One cannot export data and expect to import it into a non-Oracle database.
Various parameters are available to control what objects are exported or imported. To get a list of available parameters, run the exp or imp utilities with the help=yes parameter.
The export/import utilities are commonly used to perform the following tasks:
Backup and recovery (small databases only, say < 50 GB; for anything bigger, use RMAN instead)
Move data between Oracle databases on different platforms (for example from Solaris to Windows)
Reorganize data and eliminate database fragmentation (export, drop, and re-import tables)
Upgrade databases from extremely old versions of Oracle (when in-place upgrades are not supported by the Database Upgrade Assistant any more)
Detect database corruption by ensuring that all the data can be read
Transporting tablespaces between databases
Etc.
From Oracle 10g, users can choose between using the old imp/exp utilities, or the newly introduced Data Pump utilities, called expdp and impdp. These new utilities introduce much-needed performance improvements, network-based exports and imports, etc.
NOTE: It is generally advised not to use exports as the only means of backing up a database. Physical backup methods (for example, RMAN) are normally much quicker and support point-in-time recovery (apply archive logs after restoring the database). Also, exp/imp is not practical for large database environments.
How does one use the import/export utilities?
Look for the "imp" and "exp" executables in your $ORACLE_HOME/bin directory. One can run them interactively, using command line parameters, or using parameter files. Look at the imp/exp parameters before starting. These parameters can be listed by executing the following commands: "exp help=yes" or "imp help=yes".
The following examples demonstrate how the imp/exp utilities can be used:
exp scott/tiger file=emp.dmp log=emp.log tables=emp rows=yes indexes=no
exp scott/tiger file=emp.dmp tables=(emp,dept)
imp scott/tiger file=emp.dmp full=yes
imp scott/tiger file=emp.dmp fromuser=scott touser=scott tables=dept
Using a parameter file:
exp userid=scott/tiger@orcl parfile=export.txt
... where export.txt contains:
BUFFER=10000000
FILE=account.dmp
FULL=n
OWNER=scott
GRANTS=y
COMPRESS=y
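An import parameter file works the same way. A minimal sketch (the file name import.txt and its contents are assumptions, matching the export above):
imp userid=scott/tiger@orcl parfile=import.txt
... where import.txt contains:
FILE=account.dmp
FROMUSER=scott
TOUSER=scott
IGNORE=y
LOG=import.log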
NOTE: If you do not like command line utilities, you can import and export data with the "Schema Manager" GUI that ships with Oracle Enterprise Manager (OEM).
Can one export a subset of a table?
From Oracle 8i one can use the QUERY= export parameter to selectively unload a subset of the data from a table. You may need to escape special characters on the command line, for example: query=\"where deptno=10\". Look at these examples:
exp scott/tiger tables=emp query="where deptno=10"
exp scott/tiger file=abc.dmp tables=abc query=\"where sex=\'f\'\" rows=yes
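Shell escaping rules differ between platforms and shells. A common way to avoid them altogether (a sketch; the file name query.par is an assumption) is to put the QUERY parameter in a parameter file, where no backslash escaping is needed:
exp scott/tiger parfile=query.par
... where query.par contains:
TABLES=emp
QUERY="where deptno=10"
FILE=emp10.dmp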
Can one monitor how fast a table is imported?
If you need to monitor how fast rows are imported from a running import job, try one of the following methods:
Method 1:
select substr(sql_text,instr(sql_text,'INTO "'),30) table_name,
rows_processed,
round((sysdate-to_date(first_load_time,'yyyy-mm-dd hh24:mi:ss'))*24*60,1) minutes,
trunc(rows_processed/((sysdate-to_date(first_load_time,'yyyy-mm-dd hh24:mi:ss'))*24*60)) rows_per_min
from sys.v_$sqlarea
where sql_text like 'INSERT %INTO "%'
and command_type = 2
and open_versions > 0;
For this to work one needs to be on Oracle 7.3 or higher (7.2 might also be OK). If the import has more than one table, this statement will only show information about the current table being imported.
Method 2:
Use the FEEDBACK=N import parameter. This parameter tells imp to display a dot for every N rows imported. For example, FEEDBACK=1000 will show a dot after every 1000 rows.
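For example (reusing the dump file from the earlier examples):
imp scott/tiger file=emp.dmp tables=emp feedback=10000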
Can one import tables to a different tablespace?
Oracle offers no parameter to specify a different tablespace to import data into. Objects will be re-created in the tablespace they were originally exported from. One can alter this behaviour by following one of these procedures:
Pre-create the table(s) in the correct tablespace (see the first sketch after these procedures):
Import the dump file using the INDEXFILE= option
Edit the indexfile. Remove remarks and specify the correct tablespaces.
Run this indexfile against your database; this will create the required tables in the appropriate tablespaces
Import the table(s) with the IGNORE=Y option.
Change the default tablespace for the user (see the SQL sketch after these procedures):
Revoke the "UNLIMITED TABLESPACE" privilege from the user
Revoke the user's quota from the tablespace from where the object was exported. This forces the import utility to create tables in the user's default tablespace.
Make the tablespace to which you want to import the default tablespace for the user
Import the table
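For the first procedure, the flow might look like this (a sketch; file and table names are assumptions):
# generate the DDL only - no rows are imported
imp scott/tiger file=emp.dmp indexfile=emp_ddl.sql
# edit emp_ddl.sql: uncomment the CREATE TABLE statements and change the
# TABLESPACE clauses, then run it to pre-create the tables
sqlplus scott/tiger @emp_ddl.sql
# import the data into the pre-created tables
imp scott/tiger file=emp.dmp ignore=y
For the second procedure, the SQL (run as a DBA user) might look like this; the tablespace names old_ts and new_ts are assumptions:
REVOKE UNLIMITED TABLESPACE FROM scott;
ALTER USER scott QUOTA 0 ON old_ts;
ALTER USER scott DEFAULT TABLESPACE new_ts QUOTA UNLIMITED ON new_ts;
-- now run the import; tables will be created in new_ts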
Does one need to drop/truncate objects before importing?
Before one imports rows into already populated tables, one needs to truncate or drop these tables to get rid of the old data. If not, the new data will be appended to the existing tables. One must always DROP existing Sequences before re-importing. If the sequences are not dropped, they will generate numbers inconsistent with the rest of the database.
Note: It is also advisable to drop indexes before importing to speed up the import process. Indexes can easily be recreated after the data has been successfully imported.
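For example (object names are illustrative):
TRUNCATE TABLE emp;
-- drop the sequence so that imp re-creates it with the exported starting value
DROP SEQUENCE emp_seq;
-- optionally drop indexes to speed up the import; recreate them afterwards
DROP INDEX emp_name_idx;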
Can one import/export between different versions of Oracle?
Different versions of the import utility are upwards compatible. This means that one can take an export file created from an old export version, and import it using a later version of the import utility. This is quite an effective way of upgrading a database from one release of Oracle to the next.
Oracle also ships catexpX.sql scripts for previous releases that can be executed as user SYS to enable older imp/exp versions to work against a newer database (for backwards compatibility). For example, one can run $ORACLE_HOME/rdbms/admin/catexp7.sql on an Oracle 8 database to allow the Oracle 7.3 exp/imp utilities to run against an Oracle 8 database.
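Running the script might look like this (a sketch; the exact connect syntax depends on your release and authentication setup, and ? is SQL*Plus shorthand for ORACLE_HOME):
sqlplus sys/password
SQL> @?/rdbms/admin/catexp7.sql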
Can one export to multiple files? / Can one beat the Unix 2 Gig limit?
From Oracle 8i, the export utility supports multiple output files. This feature enables large exports to be divided into files whose sizes will not exceed any operating system limits (FILESIZE= parameter). When importing from a multi-file export, you must provide the same filenames in the same sequence in the FILE= parameter. Look at this example:
exp SCOTT/TIGER FILE=D:\F1.dmp,E:\F2.dmp FILESIZE=10m LOG=scott.log
Use the following technique if you use an Oracle version prior to 8i:
Create a compressed export on the fly. Depending on the type of data, you can probably export up to 10 gigabytes to a single file. This example uses gzip. It offers the best compression I know of, but you can also substitute it with zip, compress or whatever.
# create a named pipe
mknod exp.pipe p
# read the pipe - output to a gzip file in the background
gzip < exp.pipe > scott.exp.gz &
# feed the pipe
exp userid=scott/tiger file=exp.pipe ...
Import directly from a compressed export:
# create a named pipe
mknod imp_pipe p
# read the gzip file and output to the pipe
gunzip < exp_file.dmp.gz > imp_pipe &
# feed the pipe
imp system/pwd@sid file=imp_pipe log=imp_pipe.log ...
On a low-performance system, it is better to add the RECORDLENGTH parameter with a small value to ensure that gunzip has enough time to decompress the data before imp reads it:
imp system/pwd@sid RECORDLENGTH=4096 file=imp_pipe log=imp_pipe.log ...
How can one improve Import/Export performance?
EXPORT:
Set the BUFFER parameter to a high value (e.g. 2 MB, entered as the integer 2000000)
Set the RECORDLENGTH parameter to a high value (e.g. 64 KB, entered as the integer 64000)
Use DIRECT=yes (direct mode export)
Stop unnecessary applications to free up resources for your job.
If you run multiple export sessions, ensure they write to different physical disks.
DO NOT export to an NFS mounted filesystem. It will take forever.
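Putting these together, a direct-path export might look like this (a sketch; user and file names are illustrative; note that BUFFER only applies to conventional-path exports and is ignored when DIRECT=yes):
exp system/manager owner=scott direct=yes recordlength=64000 file=scott.dmp log=scott.log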
IMPORT:
Create an indexfile so that you can create indexes AFTER you have imported data. Do this by setting INDEXFILE to a filename and then import. No data will be imported but a file containing index definitions will be created. You must edit this file afterwards and supply the passwords for the schemas on all CONNECT statements.
Place the file to be imported on a separate physical disk from the oracle data files
Increase DB_CACHE_SIZE (DB_BLOCK_BUFFERS prior to 9i) considerably in the init$SID.ora file
Set the LOG_BUFFER to a big value and restart Oracle.
Stop redo log archiving if it is running (ALTER DATABASE NOARCHIVELOG;)
Create a BIG tablespace with a BIG rollback segment inside. Set all other rollback segments offline (except the SYSTEM rollback segment of course). The rollback segment must be big enough to hold the undo generated for your biggest table.
Use COMMIT=N in the import parameter file if you can afford it
Use STATISTICS=NONE in the import parameter file to avoid the time-consuming import of optimizer statistics
Remember to run the indexfile created earlier to rebuild the indexes
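A combined import run might look like this (a sketch; file names are illustrative):
imp system/manager file=big.dmp full=y buffer=10000000 commit=n statistics=none indexes=n log=big_imp.log
# afterwards, rebuild the indexes from the edited indexfile
sqlplus scott/tiger @big_indexes.sql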
What are the common Import/Export problems?
ORA-00001: Unique constraint (...) violated
You are importing duplicate rows into a table that already contains data. Truncate or drop the table before importing (without IGNORE=YES, imp gives an error when it tries to re-create an existing object and skips that table's rows).
ORA-01555: Snapshot too old
Ask your users to STOP working while you are exporting, or try using the parameter CONSISTENT=NO.
ORA-01562: Failed to extend rollback segment
Create bigger rollback segments, or set the parameter COMMIT=Y while importing.
IMP-00015: Statement failed ... object already exists...
Use the IGNORE=Y import parameter to ignore these errors, but be careful as you might end up with duplicate rows.