
Oracle Export

Nobody's Hero
Valued Contributor

Oracle Export

We have an Oracle database that is about 300 GB. Our export runs for a fairly long time, and the import takes about 24 hours. Has anyone ever heard of splitting the export into two or three different processes to speed things up? If so, do you know how this could be accomplished?

Ian Lochray
Respected Contributor

Re: Oracle Export

The only way I can think of to do this would be to run several separate export processes. The first would export the whole schema but with rows=n, so that no data is exported; this gives you an export of the views, synonyms, etc. The other exports would each cover part of the schema by specifying a pre-determined list of tables with the "tables=" option on the exp command.
To restore, you would import the schema export file first and then import the others with "ignore=y" so that imp ignores the fact that the tables already exist.
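As a sketch of that scheme, the exports could be driven by parameter files along these lines (the schema name SCOTT, the table lists, and the connect string are all hypothetical -- substitute your own):

```
# structure.par -- whole schema, no rows (views, synonyms, grants, etc.)
userid=system/manager
owner=SCOTT
rows=n
file=scott_structure.dmp

# part1.par -- first slice of the data
userid=system/manager
tables=(SCOTT.EMP, SCOTT.DEPT)
file=scott_part1.dmp

# part2.par -- second slice of the data
userid=system/manager
tables=(SCOTT.ORDERS, SCOTT.ORDER_ITEMS)
file=scott_part2.dmp
```

Run "exp parfile=part1.par" and "exp parfile=part2.par" as separate processes so they work in parallel. On the restore side, import scott_structure.dmp first, then each slice with ignore=y.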
R. Allan Hicks
Trusted Contributor

Re: Oracle Export

Are you using exp and imp for backup? If so, you might want to look at Tom Kyte's book Expert One-on-One Oracle. Exp and imp are good for migrating databases, but they leave a lot to be desired as backup utilities.
"Only he who attempts the absurd is capable of achieving the impossible."
Honored Contributor

Re: Oracle Export


In general, the rules for large exports are:
- use the filesize parameter
- export smaller pieces (using tables= or owner=)
- export to a device that does not support seeking (e.g. a named pipe), which sidesteps file-size limits
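As a hedged illustration of the first two rules, one export can be cut into fixed-size dump files (the user, schema, file names, and the 2 GB size are assumptions; check the exact filesize syntax against your exp version):

```
# Split one schema export into roughly 2 GB dump files. exp fills each
# file in turn and prompts for another name if the list runs out.
exp system/manager owner=SCOTT filesize=2048m \
    file=(scott_01.dmp,scott_02.dmp,scott_03.dmp) log=scott_exp.log
```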

Hopefully, this does not form part of your main backup strategy!

No person was ever honoured for what he received. Honour has been the reward for what he gave. (Calvin Coolidge)
Wim Korten

Re: Oracle Export


Our settings for export are:
# For conventional path exports:
Direct = N
Buffer = 10485760

# For direct path exports:
Direct = Y
Recordlength = 65535

Direct export is VERY fast. If you don't use it now, you could try it. For ins and outs, check the manual.

For import, set your buffer large:
Buffer = 10485760
There is no 'direct' import.
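The settings above could be collected into parameter files along these lines (connect string and file names are hypothetical; note that buffer only matters for the conventional path, since direct=y bypasses the evaluation buffer):

```
# exp_direct.par -- direct-path export
userid=system/manager
full=y
direct=y
recordlength=65535
file=full.dmp
log=full_exp.log

# imp_buffer.par -- import with a large buffer
userid=system/manager
full=y
buffer=10485760
file=full.dmp
log=full_imp.log
```

Invoke them as "exp parfile=exp_direct.par" and "imp parfile=imp_buffer.par".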
Limited by Technology
Massimo Bianchi
Honored Contributor

Re: Oracle Export

Just a couple of notes.

You can use direct=y only if no character-set conversion is involved.

If you import the data into another database, or if you have set environment variables such as NLS_LANG, you can be in for some surprises.
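For example, one way to avoid a surprise conversion is to make the client character set match the database before running exp. This is only a sketch: WE8ISO8859P1 is an assumed database character set, so check yours first (select value from nls_database_parameters where parameter = 'NLS_CHARACTERSET').

```shell
# Set the client character set to match the (assumed) database charset
# so that a direct=y export involves no character-set conversion.
export NLS_LANG=AMERICAN_AMERICA.WE8ISO8859P1
echo "client NLS_LANG is now: $NLS_LANG"
```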

Another important issue is the import: if you use a parallel import (many imp processes) you will have to pay attention to the temp tablespace (for creating indexes) and to the rollback segments.

If you dare, you can also try commit=y on import, which can make large imports more manageable.

With the default (commit=n), imp commits only at the end of each table, not as rows are loaded. DRAWBACK: you must then have a rollback segment large enough to store the ENTIRE TABLE, and that might be space consuming... With commit=y, imp commits after each buffer-full of rows instead, so the rollback requirement shrinks.