Operating System - HP-UX

Re: split a large oracle export file

 
SOLVED
Donny Jekels
Respected Contributor

split a large oracle export file

Has anyone split a large Oracle export file (3GB) into multiple smaller files, and successfully reconstructed it?
"Vision, is the art of seeing the invisible"
6 REPLIES
Steven E. Protter
Exalted Contributor

Re: split a large oracle export file

We've done it using the standard utilities, but for some reason (Oracle's quality control?) it was less reliable than keeping it as one big file on a filesystem with largefiles enabled.

newfs -F vxfs -o largefiles /dev/vg01/roraback

This issue persisted through 8.1.7.0.0 and 8.1.7.4.0
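
If the filesystem already exists, it doesn't have to be rebuilt: on HP-UX the largefiles flag can be checked, and usually switched on in place, with fsadm (a sketch, assuming a VxFS filesystem mounted at /oraback; changing the flag on a mounted filesystem requires OnlineJFS):

fsadm -F vxfs /oraback                 # prints largefiles or nolargefiles
fsadm -F vxfs -o largefiles /oraback   # turn large file support on in place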

SEP

Steven E Protter
Owner of ISN Corporation
http://isnamerica.com
http://hpuxconsulting.com
Sponsor: http://hpux.ws
Twitter: http://twitter.com/hpuxlinux
Founder http://newdatacloud.com
Zafar A. Mohammed_1
Trusted Contributor

Re: split a large oracle export file

Check this link
http://forums.itrc.hp.com/cm/QuestionAnswer/1,,0x90ae50dde50cd71190050090279cd0f9,00.html

Thanks
Zafar

[Moderator edit: The above link is no longer valid.]

malay boy
Trusted Contributor
Solution

Re: split a large oracle export file

Here you are, from Metalink:

Problem Description:
====================

You are attempting to perform a large export. When the export file grows beyond
2GB, the export fails with the following errors reported in the export log file:

EXP-00015: error on row <n> of table <name>, column <name>, datatype <n>
EXP-00002: error in writing to export file
EXP-00000: Export terminated unsuccessfully

Examine the file size of the export dump file. It should be approximately
2147M or 2.1G. This is because prior to 8.1.3 there is no large file support
for the Oracle Import, Export, or SQL*Loader utilities.


Search Words:
=============
2G EXPORT EXP IMPORT IMP GIGABYTES

Solution Description:
=====================

This is a restriction of the Oracle utilities as of the time this article was
published. There is some confusion over the >2GB patch released by Oracle,
which allows datafiles to be larger than 2GB. That patch applies only to the
RDBMS itself, not to its utilities. However, some workarounds are available.


Solution Explanation:
=====================

Oracle export dump files are still restricted to less than 2GB, as specified
in the product documentation. The same holds true for import files and SQL*
Loader data files. Here are some workarounds for exports that would produce
dump files larger than 2GB:


Workaround #1:
--------------
Investigate whether there is a way to split up the export at the schema level.
Perhaps you can export the schema with the highest number of objects as a
separate export in order to fit under the 2GB limit. Also, investigate whether
certain large tables can be exported separately, as sketched below.
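
For instance, with exp's owner= and tables= parameters (a sketch; the schema and table names are made up):

# export the biggest schema on its own
exp userid=system/manager owner=SCOTT file=scott.dmp log=scott.log

# export a couple of large tables separately from the rest
exp userid=scott/tiger tables=BIG_HIST,BIG_AUDIT file=big_tables.dmp log=big_tables.log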


Workaround #2:
--------------

!!! IMPORTANT: THESE EXAMPLES ONLY WORK IN KORN SHELL (KSH) !!!

Use the UNIX pipe and split commands:

Export command:

echo|exp file=>(split -b 1024m - expdmp-) userid=scott/tiger tables=X

Note: You can add any other "exp" parameters. This works only in ksh and
has been tested on Sun Solaris 5.5.1.

Import command:

echo|imp file=<(cat expdmp-*) userid=scott/tiger tables=X

Splitting and compressing at the same time:

Export command:

echo|exp file=>(compress|split -b 1024m - expdmp-) userid=scott/tiger tables=X

Import command:

echo|imp file=<(cat expdmp-*|zcat) userid=scott/tiger tables=X
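
Note that split names the pieces expdmp-aa, expdmp-ab, and so on, so the shell's sorted glob expansion in "cat expdmp-*" reassembles them in the right order. If the pieces travel to another machine before the import, a cheap integrity check is worthwhile (a sketch):

# run on the source and again on the target - the checksums should match
cat expdmp-* | cksum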


Workaround #3:
--------------
This is almost the same as above, but as a three-step implementation that uses
an explicit UNIX named pipe and relies only on compress, without the split command:

Export command:

1) Make the pipe
mknod /tmp/exp_pipe p

2) Compress in background
compress < /tmp/exp_pipe > export.dmp.Z &
-or-
cat /tmp/exp_pipe | compress > output.Z &
-or-
cat /tmp/exp_pipe > output.file &

3) Export to the pipe
exp file=/tmp/exp_pipe userid=scott/tiger tables=X

Import command:

1) Make the pipe
mknod /tmp/imp_pipe p

2) Uncompress in background
uncompress < export.dmp.Z > /tmp/imp_pipe &
-or-
cat output_file > /tmp/imp_pipe &

3) Import through the pipe
imp file=/tmp/imp_pipe userid=scott/tiger tables=X
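
The two workarounds also combine nicely: let the named pipe feed compress and split together, so the pieces land under the 2GB limit straight away (a sketch; the 1024m chunk size and file names are illustrative):

# export side
mknod /tmp/exp_pipe p
compress < /tmp/exp_pipe | split -b 1024m - expdmp.Z. &
exp file=/tmp/exp_pipe userid=scott/tiger tables=X
rm -f /tmp/exp_pipe

# import side
mknod /tmp/imp_pipe p
cat expdmp.Z.* | zcat > /tmp/imp_pipe &
imp file=/tmp/imp_pipe userid=scott/tiger tables=X
rm -f /tmp/imp_pipe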
There are three people in my team - me, myself and I.
Yogeeraj_1
Honored Contributor

Re: split a large oracle export file

hi,
I would do it at the Oracle Database Level itself!

Solution 1:
===========
Export to a PIPE and have compress and split read the pipe. The result is a couple of 500MB compressed files that constitute the export.

We use the CSH script below for a full export, then test the integrity of the export by doing a full import with show=y.
This gives a file with all of my source code and DDL.


#!/bin/csh -vx

setenv UID /
setenv FN exp.`date +%j_%Y`.dmp
setenv PIPE /tmp/exp_tmp_ora817.dmp

setenv MAXSIZE 500m
setenv EXPORT_WHAT "full=y COMPRESS=n"

echo $FN

cd /nfs/ORA-BACKUP/export_mydb_ora817
ls -l

rm expbkup.log export.test exp.*.dmp* $PIPE
mknod $PIPE p

date > expbkup.log
# reader: gzip whatever comes down the pipe and split it into $MAXSIZE pieces
( gzip < $PIPE ) | split -b $MAXSIZE - $FN. &
#split -b $MAXSIZE $PIPE $FN. &

exp userid=$UID buffer=20000000 file=$PIPE $EXPORT_WHAT >>& expbkup.log
date >> expbkup.log


# verify: reassemble the pieces in order, gunzip them back into the pipe,
# and run a full import with show=y (reads the whole dump, changes nothing)
date > export.test
cat `echo $FN.* | sort` | gunzip > $PIPE &
#cat `echo $FN.* | sort` > $PIPE &
imp userid=sys/change_on_install file=$PIPE show=y full=y >>& export.test
date >> export.test

tail expbkup.log
tail export.test

ls -l
rm -f $PIPE



Solution 2:
===========
use exp's multi-file support:

exp file=(file1.dmp,file2.dmp,file3.dmp,....) filesize=N

and export will create multiple dump files, each at most N bytes in size
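
For example (a sketch; the file names and the 1GB cap are made up), remembering that imp must be handed the same file list back:

exp userid=scott/tiger full=y file=exp01.dmp,exp02.dmp,exp03.dmp filesize=1GB
imp userid=scott/tiger full=y file=exp01.dmp,exp02.dmp,exp03.dmp filesize=1GB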


Hopefully this is not your primary Oracle Database backup strategy; that job belongs to RMAN.

Hope this helps!

regards
Yogeeraj
No person was ever honoured for what he received. Honour has been the reward for what he gave. (Calvin Coolidge)
Raynald Boucher
Super Advisor

Re: split a large oracle export file

Nobody has yet mentioned the option of exporting to tape.

While all options provided will work, tape works great and conserves disk space.
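
exp can write straight to the tape device, with volsize capping how much goes on each volume (a sketch, assuming a drive at /dev/rmt/0m; the 4GB figure is made up):

exp userid=scott/tiger full=y file=/dev/rmt/0m volsize=4GB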

Take care!
Donny Jekels
Respected Contributor

Re: split a large oracle export file

Thanks all. However, I cannot go back and re-export the database; we have this large file right now.

I was looking for a process to chop up "this" large file, for transportability.

The split works well, and of course we have to cat the chopped-up files back into the original.

I did a cksum before and after; works great.
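
For the record, the whole round trip looks like this (a sketch; the file names are made up):

split -b 1024m export.dmp exp_piece_    # chop the 3GB dump into 1GB pieces
cksum export.dmp                        # checksum before
cat exp_piece_* > export_rebuilt.dmp    # put it back together
cksum export_rebuilt.dmp                # checksum after - should match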

Next time I will scream at the DBAs. Today I will leave them in peace.

Donny
"Vision, is the art of seeing the invisible"