FTP and 2GB problem


Please help me in the following issues:

1. How do I copy all files, including all subdirectories, using FTP? I copied both Oracle 8i CDs onto a test server and installed from there. Now I want to copy the data from both CDs to my production server, which runs HP-UX 11.0, using FTP instead of installing the Oracle 8i software from CD-ROM. Is there a way to copy all the directories at once, instead of copying file by file with FTP?

2. While importing the database I get the following errors:
IMP-0058, ORA-01562, ORA-01237, ORA-01110, ORA-19052, ORA-27072, and HP-UX error 28.

These errors basically occur because the export was taken with COMPRESS=Y, which makes the initial table extent size 2 GB. With COMPRESS=N I can import the database successfully, but most of the tables then end up with more than 1000 extents.

Is there any way to overcome the number of extents problem?

Thanks and your help is highly appreciated

Patrick Wallek
Honored Contributor

Re: FTP and 2GB problem

FTP can't traverse down through a directory tree. What you could do instead is use tar to create an archive of your Oracle installation directory (tar -cvf oracle.tar /oracle/install/dir), FTP that single file to your other machine in binary mode, and then extract it from the tar image there (tar -xvf oracle.tar).
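As a small runnable sketch of the tar-then-transfer approach (the local paths here are invented for illustration; on the real machines you would archive the Oracle CD directories and FTP the tar file across in binary mode):

```shell
# Build a small directory tree standing in for the Oracle CD data
mkdir -p /tmp/oracle_demo/cd1/subdir
echo "data" > /tmp/oracle_demo/cd1/subdir/file.txt

# Pack the whole tree, subdirectories included, into one archive
cd /tmp && tar -cf oracle.tar oracle_demo

# (transfer step, done interactively in real life:)
#   ftp prodserver  ->  binary  ->  put /tmp/oracle.tar

# Unpack on the destination side
mkdir -p /tmp/dest
cd /tmp/dest && tar -xf /tmp/oracle.tar
```

After extraction, /tmp/dest/oracle_demo contains the full tree, subdirectories and all.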
Honored Contributor

Re: FTP and 2GB problem


You can try a recursive copy with rcp, using the -r option:

rcp -r server1:/directory server2:/path_to_destination

Hope this helps.

Bill Hassell
Honored Contributor

Re: FTP and 2GB problem

As mentioned, ftp can't recurse subdirectories, so rcp is by far the simplest. Use rcp -rp /source_dir prod_machine:/destination

As for HP-UX error 28: that means one of your filesystems filled up (errno 28 is ENOSPC, no space left on device), and the problem will also have been logged in syslog.log in /var/adm/syslog.

Now, since you mentioned that files are near the 2 GB limit, there is a strong possibility that some files are slightly larger than 2 GB; if the destination filesystem has not been enabled for large files, you'll get the same errno 28 message. Use fsadm to change the filesystem(s) to allow large files (and also to display the current largefiles setting). Note that HFS must be checked through the raw logical volume, while VxFS should use the filesystem mount point.
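To make the fsadm advice concrete, here is a hedged sketch; the mount point and logical volume names are invented, so check them against your own configuration (and see man fsadm_vxfs / fsadm_hfs):

```shell
# VxFS: display the current largefiles setting via the mount point
fsadm -F vxfs /u01

# VxFS: enable large files on the mounted filesystem
fsadm -F vxfs -o largefiles /u01

# HFS: display / enable via the raw logical volume (filesystem unmounted)
fsadm -F hfs /dev/vg00/rlvol5
fsadm -F hfs -o largefiles /dev/vg00/rlvol5
```

These commands are HP-UX-specific and require root; run the display form first to confirm whether the filesystem already says "largefiles".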

Bill Hassell, sysadmin
Jack Fan
Regular Advisor

Re: FTP and 2GB problem

1. Use rcp -rp directory remotehost:/directory to copy the subdirectories.
2. It seems you are reaching the maximum file size. Please enable large files at the OS level; exporting with COMPRESS=N is also better.
3. Or you can post the detailed error log for us.

Jack Fan

Re: FTP and 2GB problem

There is another way.
Create the "big" tables before the import with extent sizes less than 2 GB, and then, when you import your data, add "ignore=yes" to the imp command line.
To get the SQL statements that create your tables, you can run imp with the option "show=yes". This displays the contents of the export file, where you will find the CREATE TABLE commands. You only have to copy those commands and modify the extent sizes.
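As a sketch of that workflow (the username, password, and dump file name below are placeholders, not taken from the thread):

```shell
# 1. List the DDL in the dump file without importing anything
imp system/manager file=expdat.dmp full=y show=y log=ddl.log

# 2. Edit the CREATE TABLE statements captured in ddl.log so that the
#    initial extent is below 2 GB, then run them with sqlplus to
#    pre-create the tables.

# 3. Import the data; ignore=y makes imp load rows into the
#    pre-created tables instead of failing with "table already exists"
imp system/manager file=expdat.dmp full=y ignore=y
```

This requires the Oracle 8i client tools on the server; adjust the connect string and file names for your environment.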
Lee Eun-hee
Occasional Visitor

Re: FTP and 2GB problem

Before you import the table, create it like this:
"create table *** (column type)
storage (initial 1900M next 10M pctincrease 0);"

The statement above sets the initial extent size to less than 2 GB.

If you don't create the table before imp, import creates the table with its data compressed into uninterrupted extents, so some extents can reach 2 GB.

If you do it like that, you should succeed.

Re: FTP and 2GB problem

The problem with your import is that the imp utility tries to fit all the table data into a single initial extent, which is over 2 GB in size, and the OS doesn't allow imp to create an extent that large. So the way around it is to modify the table script (query) so that the initial extent size is lower than 2 GB, splitting the initial extent into 2 or 3 extents. To get the table script, you can use SHOW=Y with the imp utility. Capture the query, modify it, rebuild the table, and import the data again.